
I have never done this before, but I received the article below, along with a note of commentary on it, through an online community to which I belong. I am simply “re-blogging” both. I have removed the commentator’s name because the community is closed. I would love to hear what readers think.

I have called many times for better personal cyber hygiene and still believe we need to seek it, teach it, and require it as at least a partial mitigation. I have even made the comment that “you cannot secure against stupid.” In short, I think the article is a good one and well worth the read. It is educational and authoritative.

My colleague’s remarks in response to the article, however, gave me pause. A well-respected cyber expert with governmental, private-sector, and academic credentials, he also makes some powerful points that balance the article, particularly its title. Should we not also be putting more pressure on the designers of our systems to “do it better”? I recommend reading both items.

Human Errors, Idiocy Fuel Hacking – Bloomberg

“The U.S. Department of Homeland Security ran a test this year to see how hard it was for hackers to corrupt workers and gain access to computer systems. Not very, it turned out.

Staff secretly dropped computer discs and USB thumb drives in the parking lots of government buildings and private contractors. Of those who picked them up, 60 percent plugged the devices into office computers, curious to see what they contained. If the drive or CD case had an official logo, 90 percent were installed.

The test showed something computer security experts have long known: Humans are the weak link in the fight to secure networks against sophisticated hackers. The intruders’ ability to exploit people’s vulnerabilities has tilted the odds in their favor and led to a spurt in cyber crimes.”

As noted in the article, some of the cyber techniques used to penetrate personal and business computers include:

Whale Phishing – Targeting phishing techniques at senior-level executives, whose computers may have access to far more sensitive information than an everyday employee’s. As the article states, “Technology executives are attractive targets because their positions give them access to a trove of information, and they tend to believe they’re better protected from computer hackers than their employees.”

Faux Vixen – Thomas Ryan, a security specialist, created a fictional online persona posing as an online-security analyst. Through this persona, he gained access to e-mail addresses and bank accounts, received private documents, and fielded speaking invitations and job offers from Google Inc. and Lockheed Martin Corp.

Tracking Executives – The international hacking group Anonymous released e-mails showing that several companies hit by high-profile attacks had also been targeted by hackers operating somewhere in China. The Bloomberg article notes: “Lulz Security, known as LulzSec and made up of former members of Anonymous, announced June 25 it is disbanding after 50 days during which it claimed attacks on computers of the U.S. Senate, Public Broadcasting Service television network, and Central Intelligence Agency.”

New Products – Companies are offering products designed to resist social-engineering attacks such as spear-phishing and other e-mail or social-media lures.

The article concludes with the hope that security efforts can keep up with the growing sophistication of the threats.

This is the commentary on the above article. It struck me hard. Again, the writer is a real expert: he has fought the cyber battle in government and in the private sector, and he is a well-respected academic. I have great respect for his comments and opinions. He is much more than a simple pundit; he is an operator as well.

Commentary: “It’s easy for technologists to blame (L)users.

“Oh, now wait a minute. In just about any field of technological endeavor, if it breaks or has unintended consequences of use, we replace it and say bad design. Not so with computers. We blame the user and say they have to change. People want to drive cars at high rates of speed, drunk, high on crystal meth, listening to a bazillion watts of Lady Gaga, and what do we do? We install 57 airbags and an in-flight refueling module. Add more cowbell!

“Well, maybe not… BUT!! Consider that the title of the article includes “idiocy.”

“If the use case is broken, the designer screwed up, not the user. Just because a bunch of propeller heads can figure something out doesn’t make the use case right. If computers are inherently insecure because of the implementation, then they were designed incorrectly. Only in the bastion of information-technology arrogance would anybody make the case that people were idiots because the tools they’ve been given are the equivalent of unexploded ordnance with a label that says “please shake before opening.”

“Think about it. Incredibly intelligent people with skills and capabilities are expected to be information warriors and to make conceptual leaps like “not using a technology as designed because that is a risk.” What? A USB drive is meant to be plugged into a computer. That is how it is designed. Waffling information security experts aside, regardless of where it came from, the use case is to plug the USB drive into the computer. It is useless otherwise. Telling people not to do what it was intended to do is always going to fail. I can see a response already being written about prophylactic computing, but I don’t personally have that kind of relationship with my personal computer, and it isn’t the first thing on my mind when picking up a USB drive.

“Yes, I picked on the USB drive, but e-mail, websites, and Word documents are all threat domains with horrible technological implementations and issues that have existed for DECADES. Listen, this whole “we would have security if we could just ban the (L)user” line is great fodder. But without the user, we are out of business.

“Rip up and discard the old paradigm if you want computer security. If we keep pushing security through denial of service (egregious security policies) and down to the decision level of the lowest common cognitive technology denominator, nothing will change. We will continue to have poor information security, and I shall have a job for life. Oh, wait a second… Never mind all that. As you were! Continue doing everything exactly as we have for the last 40-some years.”

Dr. Steven Bucci is director of the Allison Center for Foreign Policy Studies at The Heritage Foundation. He was previously a lead consultant to IBM on cyber security policy. Bucci’s military and government service make him a recognized expert in the interagency process and defense of U.S. interests, particularly with regard to critical infrastructure and what he calls the productive interplay of government and the private sector.