Awareness

Chris Hoff took his three young girls to Source Boston with him last week.

First, VERY COOL and it sounds like they had a good time.

Second, it started some thoughts in his head, some conversations with others and the creation of something that will be most excellent.

HacKid Conferences

From the website:

The idea really revolves around providing an interactive, hands-on experience for kids and their parents which includes things like:

  • Low-impact martial arts/self-defense training
  • Online safety (kids and parents!)
  • How to deal with CyberBullies
  • Gaming competitions
  • Introduction to Programming
  • Basic to advanced network/application security
  • Hacking hardware and software for fun
  • Build a netbook
  • Make a podcast/vodcast
  • Lockpicking
  • Interactive robot building (Lego Mindstorms?)
  • Organic snacks and lunches
  • Website design/introduction to blogging
  • Meet law enforcement
  • Meet *real* security researchers ;)

I think this is an awesome effort.

If you have ideas or are interested in helping out, you can contact the group via @HacKidCon on Twitter or via email at hackid@HacKid.org.

-Kevin

{ 1 comment }

Name, Birthday, and Email…Why Not.

by kriggins on October 27, 2009

in Awareness

I post a lot of links in my daily bits post, but every once in a while I come across something that I think needs to be singled out. This is one of those occasions.

Graham Cluley of Sophos put up this post, which I think is a must-see. Not necessarily for those of us in the profession, but for our families and friends. The post contains a video in which they ask random strangers on the street for their full name, birthday, and email address.

Check out what happens and then forward it on to those important to you to help drive home that they need to be careful with their information.

-Kevin


{ 0 comments }

I had a Monster.com account hanging out there for a few years. I wasn't looking for a new position so all the privacy controls were turned on. Along comes the second data breach in under two years. I decided I didn't need that account anymore. I know, closing the barn door after the horse is already gone.

Anyway, I went to log into my account to have it removed and couldn't remember my password. No problem. I clicked on the 'Forgot my password' link and received a nice email with a URL in it to reset my password. Slight problem: the URL didn't point to an SSL-encrypted page.

I decided to give them the benefit of the doubt by assuming I would be redirected to a secure page to actually reset my password. Nope. The reset page was also unencrypted. To reset my password I had to let it flit across the hostile internet in cleartext. I went ahead and did it since I was deleting the account anyway.

That made me a little curious and I decided to poke around a little more to see if anything else obvious popped up. Didn't take long.

The sign-up page, which asks for your full name, email address, password, location, and current employment status, is also not encrypted. Once again, I decided to give them the benefit of the doubt and took a peek at the page source to see if maybe they posted the information to a secure page. Nope. At least not that I could find.
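For the curious, here is a minimal sketch of the kind of check I did by hand: fetch a page, see whether it is served over HTTPS, and see whether any form on it posts to an unencrypted action. The URL is a hypothetical placeholder, and looking only at form actions is a rough first pass, not a full assessment.

```python
# Minimal sketch: check whether a page is served over HTTPS and whether
# any <form> on it posts to an unencrypted (http://) action.
# The URL below is a hypothetical placeholder.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class FormActionFinder(HTMLParser):
    """Collect the action attribute of every <form> tag on the page."""

    def __init__(self):
        super().__init__()
        self.actions = []

    def handle_starttag(self, tag, attrs):
        if tag == "form":
            self.actions.append(dict(attrs).get("action") or "")


page_url = "http://www.example.com/signup"  # hypothetical sign-up page

html = urlopen(page_url).read().decode("utf-8", errors="replace")
finder = FormActionFinder()
finder.feed(html)

print("Page served over HTTPS:", urlparse(page_url).scheme == "https")
for action in finder.actions:
    target = urljoin(page_url, action)  # resolve relative form actions
    print(f"Form posts to {target} -- HTTPS: {urlparse(target).scheme == 'https'}")
```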

What this says to me is that there is a serious lack of understanding of information security in Monster.com's organization. If as basic a tenet as encrypting passwords in transit and at rest is not understood and enforced, what else are they missing?

</hops off soap box>

-Kevin


{ 1 comment }

Our Kids are in Danger!

by kriggins on December 16, 2008

in Awareness

According to a survey performed in 2006 by Cox Communications and the National Center for Missing and Exploited Children (NCMEC), 61% of children between the ages of 13 and 17 have a personal profile on sites such as MySpace, Friendster, or Xanga. Half of them have posted pictures of themselves online.

That was 2006 folks. I’m willing to bet the numbers are even higher. From the same survey, our kids have experienced the following:

  • 71% reported receiving messages online from someone they don’t know.
  • 45% have been asked for personal information by someone they don’t know.
  • 30% have considered meeting someone that they've only talked to online.
  • 14% have actually met a person face-to-face that they've only spoken to over the Internet (9% of 13-15s; 22% of 16-17s).

Not scary enough? How about these statistics from the Online Victimization of Youth: Five Years Later (2006):

  • More than one-third of youth Internet users (34%) saw sexual material online they did not want to see
  • Online harassment also increased to 9% of youth Internet users
  • Approximately 1 in 7 (13%) received unwanted sexual solicitations

So what can we do about this other than to ban our children from using the internet?  Educate them. Enter the NetSmartz program.

From the website:

The NetSmartz Workshop is an interactive, educational safety resource from the National Center for Missing & Exploited Children® (NCMEC) and Boys & Girls Clubs of America (BGCA) for children aged 5 to 17, parents, guardians, educators, and law enforcement that uses age-appropriate, 3-D activities to teach children how to stay safer on the Internet.

I learned about this program last week at the Infragard Cyber Sector meeting. It is a really neat program. They have developed several sets of materials that can be downloaded and used free of charge. The download page is here. In addition to the downloadable materials, there are many resources available on their website that provide even more information and tools.

I was not aware of this great resource until last week. Please help spread the word about it. Our children need to know how to protect themselves online, and this seems like just about the best way I have seen yet to go about it. There is going to be a train-the-trainer type session at next month’s Cyber Sector meeting. I will bring this up again after I have attended that meeting.

Kevin

{ 0 comments }

RSA Europe 2008 – Day 3

by kriggins on October 29, 2008

in Awareness, Conferences

Today is the last day of RSA Europe 2008.  I have really enjoyed being here and have attended some very interesting sessions which I will be posting about in the near future.

Today's agenda is shortened since the last keynote ends at 13:30.  For those who are interested, here are the sessions I will be attending.

Lessons Learned from Société Générale - Preventing Future Fraud Losses Through Better Risk Management
Joseph Magee, Chief Technology Officer, Vigilant, LLC.
This session explores how information security technology could have detected the fraud in this case and how it can be used to prevent it in the future.

Virtual HIPS are Growing - Whether You Like It or Not
Brian O'Higgins, CTO, Third Brigade
This session analyzes three approaches to virtualized intrusion prevention, including host intrusion prevention systems.  It discusses the advantages and disadvantages in the management and architecture of each approach and includes attack demonstrations on virtual machines.

Crash Course: How to become a Successful Online Fraudster
Uri Rivner, Head of New Technology, RSA, The Security Division of EMC

Learn how to defraud your favorite financial service! Uncover the latest tools, methods and best practices! Scalable Phishing techniques; Crimeware you can afford; Defeating 2-factor authentication. Or - if you happen to be on the other side - use these insights to develop a better strategy for protecting your consumers against fraud.

Don't Bother about IPv6? Beware: It is Already in Your Networks
Andrew Herlands, Application Security Inc.
IPv6 is the next generation of IP addressing and is already enabled by default in several OSs: Microsoft Vista, Linux, etc.  Transition mechanisms are also in place and allow IPv6 to run in tunnels over your existing IPv4 network. This session explains the transition mechanisms and the threats, and proposes mitigation techniques.

ICO - Higher Profile? Stronger Powers? More Effective?
Richard Thomas, Information Commissioner, Information Commissioner's Office, U.K.
The landscape of information security is ever-evolving.  How can organisations learn from the mistakes of the past?  How do we manage the risks?  What does the future hold?  How is the role of the Information Commissioner's Office (ICO) being strengthened?  What will be the ICO's approach?  Richard Thomas will be discussing the latest developments and topical issues to answer these questions and more.

Security Cultures and Information Security
Baroness Pauline Neville-Jones, Shadow Security Minister, U.K.
Baroness Neville-Jones will assess the cultural problems in the Government's handling of data.  She will make clear the pressing need to improve leadership, governance and accountability structures for data handling.  She will also assess the threats to the information networks on which Government Departments and critical sectors depend and will call for the Government to give concerted attention to the security of these networks and systems - as part of which it must develop partnerships with the private sector.

Have a great day!

Kevin


{ 0 comments }

In the article "Study: Global information security improves, but still imperfect", Angela Moscaritolo points us at a report recently released by PricewaterhouseCoopers, "Safeguarding the new currency of business."  The report presents the findings of the 2008 Global State of Information Security Study®. Her article points out some salient issues found in the report, but I would like to focus on one particular issue.

On page 12 of the report, we find the following:

Finding #5
Many companies, however - if not most - do not know exactly where important data is located.

Other findings in the report indicate that we are doing better at implementing technical controls and that our compliance efforts also appear to be improving. But here is the rub: what value are better technical controls and a clean compliance report if you don't know where your sensitive data is?

Okay, we don't know where our data is. We need to find it. How do we do that?

Ask 10 information security professionals that question and you will get 12 answers, all of them starting with "it depends." If we can't get a definitive answer from these folks, who can we get one from? How about the people who use that data each and every day?

Again, there are plenty of ways you could go about gathering that information from your user populace, many of which would be adequate.  But if we want better than adequate, I think Michael Santarcangelo gives us a great model for producing excellent results in his book Into the Breach.

You should get his book and read it as I have said before, but in short, engage your users in small groups and ask them how they do their jobs, in detail.  This will drive out where your data is. You may think your data is that big honking database, but what if a lot of it is in spreadsheets stored on a file server that you know nothing about?
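Talking to the people who actually use the data is the heart of Michael's approach, and nothing replaces that. But as a rough, complementary sweep, a small script can at least surface spreadsheets sitting on a file share so you know which groups to sit down with first. This is only a sketch under assumptions of my own: the share path is hypothetical, and the SSN-style pattern is just one stand-in for whatever counts as sensitive in your shop.

```python
# Rough complement to the user interviews: walk a file share, list spreadsheet
# files, and grep CSVs for SSN-like values. Path and pattern are assumptions.
import os
import re

SSN = re.compile(rb"\b\d{3}-\d{2}-\d{4}\b")   # stand-in for "sensitive data"
SPREADSHEET_EXTS = (".xls", ".xlsx", ".csv")

share_root = r"\\fileserver\departments"       # hypothetical UNC path

for dirpath, _dirs, filenames in os.walk(share_root):
    for name in filenames:
        if not name.lower().endswith(SPREADSHEET_EXTS):
            continue
        path = os.path.join(dirpath, name)
        print("Spreadsheet found:", path)
        # Only plain-text formats can be grepped directly; .xls/.xlsx need a
        # real parser (e.g. openpyxl) because their contents aren't plain text.
        if name.lower().endswith(".csv"):
            try:
                with open(path, "rb") as fh:
                    if SSN.search(fh.read(1_000_000)):  # sample first ~1 MB
                        print("  -> contains SSN-like values")
            except OSError:
                pass
```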

This is a very simplified treatment of a great process that Michael details in his book. So, again, go get it. Read it. Twice. You will not regret it.

Kevin

{ 0 comments }

Umm...it's not a technology problem.

by kriggins on August 1, 2008

in Awareness

Richard Stiennon says:

So, yes, there is good security awareness training. But I do not include teaching Bobby in reception how to avoid being taken in by Kevin Mitnick. It is futile and silly to expect your average employee to become paranoid enough to ward off social engineering attacks. Rather than invest in posters in the elevators exhorting people to stop strangers in the hallway, you should be investing in better security technology.

I do not agree.  Read the whole article and then come back here. I'll wait.

I've been reading Michael J. Santarcangelo, II's book Into the Breach. I was lucky enough to get a preview copy. I will be posting in more depth what I think of this wonderful book, but I do want to offer the following from the introduction:

We face a human problem where people are the problem. The problem is that people have been unintentionally, but systematically, disconnected from the consequences of their decisions. As a direct result, they do not take responsibility and are not held accountable.

I agree that technical controls are important and should be implemented where appropriate. However, I disagree that providing awareness training to our people is a waste of time and resources. It can probably be done better, but it still needs to be done. How can we, as information security professionals, expect our users to treat information with due care if they are not aware of the importance of that information and the appropriate way in which to handle it? I submit that we cannot. We must, therefore, help them understand both the nature of the information they deal with on a daily basis and how to handle it in a way that keeps it secure.

That's where I stand. I am really interested in your thoughts. What do you think about technical controls vs. awareness?

Kevin


{ 3 comments }

Influencing our user community….

by kriggins on May 1, 2008

in Awareness, General

Mike Rothman, in his latest Pragmatic CSO Newsletter (I highly recommend subscribing), has a really good post up about our responsibility to ensure that our user community understands why they should be adhering to established policies and not attempting to circumvent controls put in place to protect our organizations.

I left the following comment and now am going to reuse it as a post :)

Mike,

I have been reading the book "Influencer: The Power to Change Anything" which I highly recommend. In it they posit that there are essentially six sources of Influence. They fall into two categories and what I call three strata. The categories are motivation and ability and the strata are personal, social and structural. Where motivation and personal intersect, the source of influence is defined as "Make the Undesirable Desirable."

If the general user community does not desire to adhere to or follow established policies and is actively attempting to circumvent controls, then we have failed to instill in them a desire to be compliant. It is our responsibility to influence them to change that mindset, in other words, to make the undesirable desirable.

So how do we do that? What you suggest exemplifies what the authors of the book have discovered. People are much more likely to embrace ideas when they have been shown the consequences of ignoring those ideas in a very personal and impactful way. I'm not saying that we should all use the specific scenario you suggest, although it would certainly bring home the message :), but we do need to find ways to instill awareness into our user communities that is much more personal than "read this policy and sign this paper."

Kevin Riggins

{ 1 comment }

Meaningful Conversation

by kriggins on March 24, 2008

in Awareness, Educational

Scott Young over at PickTheBrain writes in this post about a couple of ways to improve the quality of the conversations we have with people.

He points to two basic rules that can help make conversations more meaningful.

  1. The conversation is not about you.
  2. You need to give trust to get trust.

I will leave it to you to explore his take on these two tenets from a general conversational perspective. However, it strikes me that if we, as Information Security professionals, would incorporate these rules into our conversations with our respective constituents, we might be met with a little less resistance. Of course, I am speaking from the perspective of being a corporate drone.

Having a conversation with the Information Security dude or dudette is viewed with a certain amount of trepidation by many who are "forced" to deal with us. In my experience, most of this trepidation is caused by us and not the poor supplicant :) Why do you think they feel this way? Let's look at number 1 above first.

1. The conversation is not about you.

Pretty simple statement. Harder to put into practice than it appears though. Let's change it a little; the conversation is about them. They are looking, whether they know it or not, for the best method of accomplishing their goal in the most secure manner available that is appropriate for the business risk they have chosen to accept. Which, by the way, is a topic for another post. If we approach things from this perspective, it becomes a collaborative endeavor, not an adversarial one. Of course, I am not suggesting that there will not be times when we are required to tell people they can't do something in the manner they desire. But as long as we avoid just saying no and try to help them find a way that is also acceptable from an infosec perspective, we have still remained their helper and not their roadblock.  If they view us as their helper, they will be less concerned when they need to talk to us.  They will involve us earlier and finally will be more likely to share more information with us.

2. You need to give trust to get trust.

This one is even more difficult. Why should they trust you? Do they know you? We have to build relationships with the people we work with. For those of us who work in the corporate world, this is a little easier. I talk to the same folks day after day and we have the opportunity to get to know each other and build trust.  I have to trust that they believe I have their best interests at heart, and they have to trust that I am not out to "get them" or stop them from being successful.  Following rule 1 above goes a long way towards building this trust.  Those who don't have the luxury of long-term relationships with the folks they are dealing with have to find some way to establish that trust quickly, right at the beginning.  Again, approaching it from a rule 1 perspective will help a great deal.

So there is my two cents worth about something that has been a problem in several companies for which I have worked.

I have not done the subject matter justice, but it was on my mind so here it is.

{ 0 comments }