A couple years ago I got an email from my alma mater soliciting feedback for their "Professional Responsibilities in Engineering" course. When I took that course the focus was "you have a responsibility to do proper risk assessment and design beyond strict requirements so you don't kill people, and this responsibility to your fellow citizens is greater than a particular job or even your own career". In my opinion this is where an engineer's responsibility should start and stop: excel at your craft so people benefit from your work, and don't betray the trust the people implicitly place in you and your work.
They were trying to shoehorn "Diversity and Inclusion" into this course. I gave my feedback that a greater focus should be on analyzing and assessing risk (because this is a skill inadequately taught at pretty much all levels of engineering and is a core component of engineering where lives may be at stake), but I got the sense I was "shouting into a hurricane" despite the respect I have for the professor who taught the course.
Let's just say I'll become very nervous driving over bridges and flying in airplanes around the time I retire.
Don't drive under any bridges either.
Before the Viaduct in Seattle closed whenever I drove under it there was always a thought in the back of my mind that said "if there's an earthquake right now my life ends" because it was the same design as the one that collapsed in Oakland in the '89 quake.
Isn't engineering a giant risk management exercise? Anyone can build a bridge, but only an engineer can build a bridge that barely stays up.
Yes. For everything (planes, trains, and automobiles) there will be some combination of failures where, if it occurs, everyone dies. The trick is to make that combination so unlikely that you'd have to really fuck up for it to happen. Which is why, when you read something like a plane crash report, it's never one thing that goes wrong but a combination of things (sometimes accumulating over years) that leads to disaster.
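To see why the "combination of things" framing matters, here's a toy sketch (invented numbers, not any real system's data): if a disaster requires several roughly independent failures to line up, the joint probability is the product of the individual probabilities, which collapses fast.

```python
# Hypothetical per-flight probabilities for three independent failures
# that must ALL occur for disaster. Numbers are made up for illustration.
p_failures = [1e-3, 1e-4, 1e-2]

# Joint probability of all failures occurring together (independence assumed).
p_disaster = 1.0
for p in p_failures:
    p_disaster *= p

print(p_disaster)  # roughly 1e-9: each failure alone is plausible, all three together are not
```

Real failures are rarely fully independent (common causes, shared maintenance lapses), which is exactly what fault-tree and common-cause analyses try to account for.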
There are very specific processes people use to assess risk in a formal way, but they aren't really taught in school. And even when you know how to do it there's a fair amount of subjectivity involved unless there are objective regulatory requirements you have to meet (in which case you don't want to ask how the "objective" regulatory requirements were determined any more than you would want to ask how the sausage you're eating was made).
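One example of those formal processes is FMEA (Failure Mode and Effects Analysis), where each failure mode gets 1-10 ratings for severity, occurrence, and detectability, and their product (the Risk Priority Number) ranks what to address first. A toy sketch with invented failure modes and ratings, just to show the mechanics and where the subjectivity creeps in:

```python
# Toy FMEA-style table. Descriptions and ratings are invented;
# in practice the ratings themselves are where the subjectivity lives.
failure_modes = [
    # (description, severity, occurrence, detection), each rated 1-10
    ("bearing seizes", 9, 3, 4),
    ("sensor drifts",  5, 6, 2),
    ("weld cracks",   10, 2, 8),
]

# Risk Priority Number = severity * occurrence * detection;
# higher RPN means higher priority for mitigation.
ranked = sorted(
    ((desc, s * o * d) for desc, s, o, d in failure_modes),
    key=lambda item: item[1],
    reverse=True,
)

for desc, rpn in ranked:
    print(f"{desc}: RPN {rpn}")
# weld cracks: RPN 160
# bearing seizes: RPN 108
# sensor drifts: RPN 60
```

Note the "weld cracks" mode ranks first despite being the rarest, because poor detectability inflates its RPN. Whether that's the right ranking is exactly the kind of judgment call the process can't remove.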