Thursday, November 19, 2009

Virtualization is Real

I remember back in the day when ‘virtual’ meant ‘almost,’ ‘simulated’ or ‘in essence,’ as in, ‘I’m virtually there.’  Today, as it has made its way into computer terminology, it can mean actual or real things that are done over computers.  Virtualization has been the main enabler of Cloud Computing and has become an important tool for IT.

I recently attended the 2009 Cloud Computing and Virtualization Conference & Expo in Silicon Valley and wanted to share some of my observations.  The show has certainly grown since last year but is still a nice, small(er) conference with plenty of opportunity for good conversations.  Cloud ‘solutions’ seemed to dominate the talks even though there is still a lot of confusion about the Cloud, with a good portion of participants appearing to be in the investigative/learning stage.  Many of the attendees were still just trying to understand the whole ‘cloud’ terminology, and I felt like one of the most informed – which means there is still plenty of opportunity to educate folks.  Security was a big topic, as you can imagine, but this year the presentations seemed focused on solving those fears instead of just listing them as inhibitors.
One of the sessions I enjoyed was ‘Cloud Security - It's Nothing New; It Changes Everything!’ (pdf) from Glenn Brunette, a Distinguished Engineer and Chief Security Architect at Sun Microsystems.  He first reviewed the hallmarks of information security – CIA, the Guiding Principles, Managing Risk and so forth – and indicated that the Cloud doesn’t change any of that.  There’s no difference in what drives security or in the concepts; it’s the implementation that is different.  So if the overall Security Services are the same, and if the traits are the same, what’s missing?  According to Glenn, the thing that Cloud Computing Security demands is: CONTEXT.

He reviewed some of the challenges facing Cloud Security:

Speed – the agility to quickly configure services.  Security is usually the last part of the architecture, but how do you secure services and enforce policy when units are being spun up and down at a rapid pace?  It’s an opportunity to re-think.  One thing Sun (and others) are starting to do is bake security best practices right into the image before sending it to the cloud.  Why make the customer deal with securing the underlying system when the provider can build the needed security right into the image?  Pre-integration and assembly allows the customer to still deploy quickly, but securely.
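To make the “bake security into the image” idea concrete, here’s a minimal sketch in Python. The baseline settings, field names and `harden_image` function are all hypothetical illustrations of the pattern, not any real provider’s API: the provider applies a known-good security baseline to the image definition before publishing it, so the customer deploys quickly without having to harden the underlying system.

```python
# Hypothetical sketch: a provider applies a security baseline to an image
# definition before it is published to the cloud. All names are illustrative.

BASELINE = {
    "disable_root_ssh": True,     # no direct root logins
    "firewall_default": "deny",   # default-deny, open ports explicitly
    "audit_logging": True,        # logging on from first boot
}

ALLOWED_SERVICES = {"web", "app", "db"}

def harden_image(image_config: dict) -> dict:
    """Return a copy of the image config with the baseline applied and
    any services outside the allowed set stripped out."""
    hardened = dict(image_config)
    hardened.update(BASELINE)
    requested = set(image_config.get("services", []))
    hardened["services"] = sorted(requested & ALLOWED_SERVICES)
    return hardened

image = {"name": "customer-app", "services": ["web", "telnet"]}
print(harden_image(image)["services"])  # ['web'] -- telnet stripped
```

The customer still describes what they want; the security decisions ride along in the image instead of being a post-deployment chore.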

Scale – today security administrators deal with tens, hundreds, even thousands of servers, but what happens when potentially tens of thousands of VMs get spun up, and they are not the same as they were an hour ago?  Security assessment tools like Tripwire work, but they inject load – and what if those servers are only up for 30 minutes?  How can you be sure that whatever was up and serving content was secure?  One idea he offered was to have servers live for only 30 minutes, then drop and replace them.  If someone did compromise a unit, they’d only have a few moments to do anything before it’s wiped.  You can keep the logs but just replace the instance.  Or, use an Open Source equivalent every other time you load, so crooks can’t get a good feel for the baseline system.
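The 30-minute-lifetime idea can be sketched as a simple rotation loop. This is a toy in-memory simulation, not a real cloud API – the fleet structure, `rotate_fleet` and `archive_logs` names are all made up to illustrate the pattern of wiping and replacing instances on a schedule while preserving the logs.

```python
# Toy sketch of the "30-minute lifetime" idea: instances past their
# lifetime are wiped and replaced; only their logs survive.

import itertools

LIFETIME_MIN = 30
_ids = itertools.count(1)

def archive_logs(inst: dict) -> None:
    """Placeholder: ship the instance's logs to durable storage before the wipe."""
    pass

def rotate_fleet(fleet: list[dict], now_min: int) -> list[dict]:
    """Replace every instance older than LIFETIME_MIN minutes."""
    fresh = []
    for inst in fleet:
        if now_min - inst["started"] >= LIFETIME_MIN:
            archive_logs(inst)                             # keep the logs...
            inst = {"id": next(_ids), "started": now_min}  # ...replace the VM
        fresh.append(inst)
    return fresh

fleet = [{"id": next(_ids), "started": 0}]   # one instance, started at t=0
fleet = rotate_fleet(fleet, now_min=31)      # past its lifetime -> replaced
print(fleet[0]["started"])                   # 31
```

An attacker who lands on an instance has at most one lifetime window before the slate is wiped clean.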

Accessibility – anyone with a credit card can now deploy cloud services.  Maybe someone feels IT is too slow in deploying a particular service and decides to do it themselves.  They now have substantial resources available and not a lot of knowledge of current policies.  How can you be sure that policies are enforced across the board on all deployments?

Transparency – customers need a comfort level with how the data is kept safe, how keys are managed and how to contain a problem in the cloud – essentially understanding the provider’s standards and processes.  There are more IT elements, more change events, more data and less control – that’s the fear.  The cloud makes these challenges more visible.

Consistency & Integrity – knowing the exact configuration of any machine at any time.

Key Management – this is a huge problem with providers.  Doing a backup to the cloud (while keeping the keys close) is OK, but if you intend to use that data, the keys also need to be stored in the cloud.  Being able to do a fast recovery can also require keys out there.  Additional legal verbiage is what typically covers key management today.

Accountability – Service Level Agreements.  SLAs are often not so strong on the provider end, and customers frequently need to negotiate this area.

Compliance – auditors.

There are changing architectural strategies in the cloud. Tight Integration becomes Dynamic Assembly; Inspections become Telemetry of Objects; Repair & Recover turns to Recognize & Restart; and Log Scraping becomes Analytics. You just need to change some of the old habits. Opportunities exist for standardization but in the meantime, get to a manageable set of things that need to be done and build upon the automation. Glenn closed with his Cloud Security Rules:
  • Embrace Security Systematically
  • Design for High Survivability (fight thru)
  • Compartmentalize failure (nodes going down)
  • Minimize Trust Boundaries (how far does the data go)
Good advice.

Friday, November 13, 2009

You’ve Taken That Out of Context

Hello and welcome to the new hit game show: You’ve Taken That Out of Context!  Hilarity ensues in this action-packed half-hour as contestants try to deliver the appropriate resources to end users depending on several factors and circumstances.  So let’s get right to it.  Our first contestant is Danny, an IT Director from Boston, and he’s getting his first request….. OK, the user is coming from a home computer, without a certificate, from a broadband connection, and is a partner – what are you going to give them, Danny?  Wow, excellent!  You’ve provided a simple web application, delivered through a reverse proxy, so he can enter his time & materials expense report.  Great decision, Danny!

Our next contestant hails from Chicago and runs a data center for a large manufacturer – please welcome Greg.  Whoop, here comes Greg’s request….. The user is a trusted employee in sales needing to enter customer info, using an IT-issued laptop with specific reg-keys and updates, but working from a wireless network.  How are you going to handle it, Greg?  Nice move!  Offering them not only their specific, optimized order entry application but also a connection to Exchange so they can download their email and stay current.  Sweet – keeping users productive while on the road – great work.

And our last contestant comes from Texas, where he’s the Network Engineer for a distribution company – round of applause for Tom!  Alright Tom, let’s see your request.  It’s coming fast: the user is a vendor who needs to see inventory levels.  They are coming from their corporate LAN on an IT-issued computer and do have a certificate for certain applications.  Whatcha gonna do, Tom?  A full Layer 3 network-connected tunnel?  Well, let’s see.  They get connected, they are navigating to their favorite app, so far so good, and logging in, cool.  Wait, what’s this – the user has initiated a sniffer and found some financial docs.  Oh no!  He’s downloading the latest financial statements that aren’t public!  That spreadsheet has much of our sensitive data, but it’s too late – they are long gone, along with your data.  Sorry Tom, a little too generous with that, but you do get a copy of our home game, where players act out partial scenes and you have to guess the context!  Thanks for playing.

User Centric or Contextual Aware Computing is finally starting to gain some traction, partially driven by cloud computing.  User Centric or Contextual Based networking is simply Adaptive Access: using intelligence to dynamically change the security applied to a specific access request based on the context of that request, the resources being accessed and the policy applied between the two.  The goal is to provide a unified method of applying security and delivering applications regardless of the actual security in effect, the network or the device being used to request access.  It’s access security based on user, device, location and integrity, both at the time of the request and for the duration of access.  It provides intelligence, adaptability and auditability for every user, every time.  Context is about the environment or conditions surrounding an event; with that information, we may perceive something differently, which might change our view and maybe our decisions.  It’s about seeing the bigger picture and making better decisions by comparing the information we have about the request with the requirements of the application and the policies in place, to deliver the proper access.  Gartner calls this the ‘Digital Me.’
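The game show scenarios above boil down to a policy decision over the request context. Here’s a minimal sketch, in Python, of what such an adaptive-access decision might look like; the context fields, access levels and thresholds are all hypothetical, chosen to mirror Danny’s, Greg’s and Tom’s situations rather than any real product’s policy engine.

```python
# A minimal sketch of context-based ("adaptive") access: the decision weighs
# user, device, and network at request time. Field names and access levels
# are illustrative, most restrictive checks first.

def decide_access(ctx: dict) -> str:
    """Map a request context to an access level."""
    if not ctx.get("authenticated"):
        return "deny"
    if ctx.get("device") == "unmanaged" or not ctx.get("certificate"):
        return "web-portal"    # single web app via reverse proxy (Danny's call)
    if ctx.get("network") == "wireless":
        return "app-tunnel"    # specific apps only, no full network (Greg's call)
    return "full-vpn"          # trusted user, managed device, trusted network

partner = {"authenticated": True, "device": "unmanaged", "certificate": False}
print(decide_access(partner))  # web-portal
```

Tom’s mistake, in these terms, was returning "full-vpn" for a context that only justified a narrower level – the policy, not the contestant, should make that call every time.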

Gartner predicts that by 2012, there will be more than 7.3 billion networked devices worldwide and 298 million subscribers of location-based services.  This is more than just delivering secure applications; it’s also about delivering the right resources to the right user at the right time.  More than ever, users are dispersed all over the globe, arriving from a multitude of devices and networks while requesting access and information from your systems.  You need to offer that user the proper access in a quick, secure and efficient manner, with the proper controls.  You need to make the right decisions based on that moment of information as we move from Identity-based (user/password with some customization) to Contextual-based (Identity plus a whole lot more) delivery models.  You need to ensure that no one is coming in, or taking anything out, without context.


Friday, November 6, 2009

IPv6 and the End of the World

There have always been conspiracy theories when security-type events happen, or in instances where there is secrecy.  There are those who don’t buy the ‘reported’ reason a security event (like a breach) occurred, those who claim to have inside information, or just those who see a story and draw their own conclusions.  The following is my take (Satire Alert) on Transmission Control Protocol/Internet Protocol v6 and the end of the world as we know it.  That can affect our security, right?!?

Recently there have been more than the usual number of articles about IPv6 and the need to deploy it soon, since the v4 blocks are almost gone.  Yes, we’ve been hearing this for years (RFC 2460 was defined in December 1998), but now the hype may be over, as indicated in this article.  There are many security enhancements in v6, nicely covered here, but that’s not where I’m going.

In my first blog post on DevCentral, aptly titled First Post, I introduced psilva’s prophecies.  I’ve been in the Internet industry since ’94, and while not a ‘know it all,’ I have seen my share of changes and have seen a bunch of ‘ideas’ come true over time.  For instance, I had always thought that the Internet would eventually become our entertainment delivery method, and some 14 years later, that’s the case.  That’s not that wild, as I’m sure many of you figured it was only a matter of time once we started to see streaming video and broadband to the home.  In that First Post, I offered my prediction of how our nomenclature might change over the next 50-100 years: that we no longer give our full name/address for contacting/correspondence as we’ve done in the past – we just give email.  The idea was that over time, our current first/last naming convention might dissolve to where we are known as users@domains or a single string of characters.  Twitter is reinforcing that with its @namingconventions.

IPv6, at 128 bits (v4 is 32-bit), gives us the ability to assign an IP address to just about anything – heck, all the portable mobile devices we carry each need one, and consumer appliances like TVs, refrigerators, thermostats, DVRs, garage door openers, coffee machines and just about any electronic item could potentially have an IP address.  Schedule your toaster via a Web GUI to perfectly brown your bagel when you get home.  You can already control your lights and alarm systems over the internet.  In addition, each one of us, worldwide, would be able to have our own personal IP address that would follow us anywhere.  Hold on, I’m getting a call through my earring but first must authenticate with the chip in my earlobe.  That same chip, after checking my print and pulse, would open the garage, unlock the doors, disable the home alarm, turn on the heat and start the microwave for a nice hot meal as soon as I enter.  I could chip my child (like the dog) to be able to GPS their behind if they are not at the movies as indicated.  Not so farfetched.  That doesn’t sound so sinister, psilva, so how can that be the beginning of the end?
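To put the jump from 32 to 128 bits in perspective, a few lines of Python with the standard library’s `ipaddress` module (the example network is the RFC 3849 documentation prefix):

```python
# Comparing the IPv4 and IPv6 address spaces with the standard library.
import ipaddress

v4 = 2 ** 32    # entire IPv4 address space
v6 = 2 ** 128   # entire IPv6 address space

print(f"IPv4 addresses: {v4:,}")      # 4,294,967,296
print(f"IPv6 addresses: {v6:.3e}")    # roughly 3.4e+38

# Even a single /64 subnet holds more addresses than the whole IPv4 internet:
subnet = ipaddress.IPv6Network("2001:db8::/64")
print(subnet.num_addresses > v4)      # True
```

With 2^128 addresses, handing one to every toaster, earring and earlobe chip on the planet barely dents the pool.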

OK, now the fun begins.  While not a Nostradamus follower – although the History/Discovery Channels have covered him often – he does have something to say about numbers.  You might remember he got a lot of press, and was the subject of spam, after 9/11 due to this quatrain, which his followers say shows that he predicted that disaster.  Conspiracy?  He was very much into numbers and also indicated that when we are all identified as numbers, that will be a sign of the impending doom.  We do have a numbering system in the States called the Social Security Number, which is our government identity and very much linked to our own security.  With IPv6, the entire world can now be identified by number, and thus psilva’s prophecy #2 is fulfilled.  The timing is right, too.  2012 is getting a lot of play as the end of time; both the Mayans and Nostradamus point to 2012 as the end of days, and Hollywood has taken notice.  Now, this does slightly negate my 1st prophecy, since I gave our name change around 50 years, but 2012 does sound about right for a full IPv6 transformation, so it fits nicely with the doomsayers – if you’re into conspiracies.