Steve e ashton  
#1 Posted : 11 October 2016 16:37:42(UTC)
Rank: Super forum user
Steve e ashton

A massively interesting and intriguing article here:  https://www.theguardian.com/technology/2016/oct/11/crash-how-computers-are-setting-us-up-disaster?CMP=share_btn_fb

Deals with automation, critical performance, contingency planning, deskilling the workforce, risk perception and so many other aspects of risk management, and certainly gives pause for thought. Genuinely worth a read.

A Kurdziel  
#2 Posted : 12 October 2016 08:32:22(UTC)
Rank: Super forum user
A Kurdziel

It’s interesting, but not that new an idea: these are the different human error modes which James Reason describes in his book and which are also detailed in HSG48, “Reducing error and influencing behaviour”. It’s just that modern technology makes this sort of error more common (or perhaps more obvious). A person is working in a skill- or rule-based mode – e.g. flying a plane by wire or, back in 1979, monitoring a nuclear reactor at Three Mile Island – when something happens and the rules they think apply no longer do. They then have to use their knowledge and skill (which may be rusty through lack of practice, or not even there because the job has been deskilled) to sort the problem out. In all of these situations there is a resistance to abandoning the internal model they have describing the situation, so the actions they take make the situation worse. Very scary.

A trivial example from a few years ago: I was waiting for a rep from a company who were bidding to supply us with lone worker solutions. They said they would arrive at our site at 15:00. The time came and went and it was nearly home time, so I contacted the company’s office. They put me through to the rep, who said, “I must have put the wrong name for your location into my satnav, as a single word rather than two. The satnav sent me to the wrong location, about 30 miles away.”

“Oh,” I said, “when did you realise that you were heading in the wrong direction?”

“When I passed the turn-off to York on the A1, about an hour ago. But I felt I had to follow the satnav, just to be sure. Will you still be in the office when I get there, in about an hour?”

“No,” I said. “Bye-bye, and don’t bother coming back!”

And now we are thinking about driverless cars. Will they have a manual override function, and how will we ensure that drivers are ready to take over from the machine? Or will we just let the machines drive us over a cliff, just to be sure?

 

sadlass  
#3 Posted : 12 October 2016 13:07:13(UTC)
Rank: Forum user
sadlass

Edward Tenner’s book “Why Things Bite Back: Technology and the Revenge of Unintended Consequences” – written in 1997 . . .
