How IT Works

October 14, 2012

Before I got into “Enterprise” and Government computer work, I would daydream about corporate computer systems. I imagined them being similar to what we see in the movies: awesome firewalls, and control rooms full of blinking lights, run by conscientious, dedicated sysadmins.

After years working on and in these systems, I know the truth. The truth is, your average large-scale computer system is clunky, probably held together by text files and FTP transfers, and little more than one giant shortcut. This is evidenced by Keith Ng’s recent discovery at the Ministry of Social Development.

IT projects that start out with the best of intentions get ramrodded into a column of compressed bullshit in so many ways it’s depressing. It starts with the original tender process whereby competing companies under-bid each other in an almost complete knowledge vacuum to win the right to build a system that’s already under tight marketing (or legal, or financial) deadlines.

In the response documents for these tenders, you’ll find the “security” section is almost always a copy-paste from the sales documentation for the product being used. Find a whitepaper about Microsoft SharePoint Security and copy-paste the relevant sections. Do the same for Oracle, SAP or IBM. You’re not paid to produce the tender response, so why spend more time than necessary on it, right?

Then when the inevitable hurdles are encountered, every person presenting “proper” fixes is bashed into submission. “Ok then, give us two proposals, the ‘proper’ one, and a ‘shortcut’ with risks outlined”. Guess which one gets selected “because we have to meet deadlines”? Guess how many of the risks are addressed properly?

So yes, the MSD debacle is depressing. Not surprising, not alarming. Depressing. Someone, somewhere, knew what was going on. They knew that the kiosks were running with admin privileges, or that the unprivileged accounts on the network had too much access. They probably suggested putting a firewall between the kiosks and the network, but were told “hell no, we can’t afford a firewalled network port in every single MSD office”.

Or perhaps a junior staffer was sent out to set up the kiosks, and couldn’t get them working right, so she logged in with an admin password to make everything work. Maybe she called her manager and said “hey I’ve done it this way, I know it’s wrong, but I’m not sure how to fix it, can we book someone more senior to sort it out?”

Or perhaps someone requested that the Kiosk login account be added to the security group that permits internet access, and no one thought to check if that group also had access to thousands of files.

Perhaps even somewhere there’s an entry on a “risk analysis” spreadsheet that says “Kiosks have too little security and may allow unauthorised access (Risk likelihood: moderate, Risk impact: high)”, and this has been glanced over by an assistant to a CxO and signed off as OK. I doubt it.
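The group-membership scenario above can be sketched in a few lines: when access checks reduce to unaudited group intersections, one extra membership silently grants everything the group already holds. A minimal sketch in Python (all group and share names are hypothetical, not MSD’s actual configuration):

```python
# Minimal model of the scenario above: access is decided by whether any of a
# user's groups appears in a share's read ACL. All names are hypothetical.

def can_read(share_read_groups: set[str], user_groups: set[str]) -> bool:
    """True if the user belongs to at least one group the share grants read to."""
    return bool(share_read_groups & user_groups)

# A file server left on defaults: everyone in "Domain Users" can read the share.
invoice_share = {"Domain Users"}

# The kiosk account was added to a group for internet access, but as a
# domain account it is also, automatically, a member of "Domain Users".
kiosk_account = {"Internet Access", "Domain Users"}

print(can_read(invoice_share, kiosk_account))  # True: the kiosk can read invoices
```

The point of the sketch: nobody granted the kiosk access to invoices on purpose; the access fell out of a default membership that no one thought to check.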

I doubt we’ll ever hear exactly what went wrong. Rest assured however that holes like this exist in every single government department and corporate IT system in New Zealand.

Until we start from a culture of security and professionalism, nothing will change.


18 thoughts on “How IT Works”

  1. Warren Clark

    Agreed. I’ve done some Govt IT work before and seen some absolutely dirty, dirty hacks deployed in the name of “just getting it done”.

  2. Berend de Boer

    And this post is another reason why John Key’s proposal to put 30,000 kids in a database is a huge problem.

    1. Dylan

      Especially when that database is intended to be accessed by a wide variety of largely unvetted users on a wide variety of networks.

      It will be virtually impossible to secure it.

  3. Andre

That’s been my experience, too. I’d add one additional factor though: as a result of the piecemeal approach to security, things like password management for end users become increasingly painful. (At my last job, I had *seventeen* different applications with passwords to manage, all of which had to be changed once a month.)

Inevitably, when a system becomes that complicated, users look for ways to limit the amount of inconvenience to themselves, which results in easily guessable passwords, often stored on paper or in text files, or users obtaining admin access to applications to manage their own or their team’s regular password resets, etc.

    Inevitably security suffers in that environment, because users see security as an impediment to doing what they’re actually paid to do.

    1. S

Absolutely: when your security policies become onerous, your security becomes weak.

      1. Koz

        Yeah, I wouldn’t be at all surprised if the files that are stored on open network shares are actually just staff working around a horrific IT system that they’re *meant* to be using.

        If so, the review will blame the staff for ‘circumventing policy’ without analyzing why that’s been done.

  4. p

    yeah, very inspirational first words. I used to daydream about it too…

    Today, after coming home at 4am after spending the night at a gov’s office (not the MSD) doing deployments for a bunch of enterprise apps, I am not quite as excited anymore.

I guess it is never too late to make a career shift, right? I am determined to work for myself in the coming year, and actually produce something that adds value to my life!

  5. Qyiet

    I’ve had to fight for security several times in my job.

    It’s depressing that people who find concepts such as the address bar in a web browser complex get to make the final decisions on this stuff.

More depressing still: after I demonstrated how terrible a current security practice was (full admin rights gained in a matter of hours, from outside the org, with simple brute force), the practice was left in place because ‘nothing bad had happened yet’.

    Pretty sure they didn’t even change the password on the account I got access to.

  6. it-sec

    To be fair, I have encountered organisations where security is taken seriously and plenty of rigour applied.
    Usually there are two factors involved (see what I did there :) ): a senior management team who buy into security, and/or a regulatory compliance aspect, where security standards are mandated by external relationships and externally audited for compliance.

    In this particular instance the problems seem to be architectural, procedural and operational.

    In the first place, the whole kiosk solution architecture seems poorly designed for security from the get-go. Why on earth did the kiosks need to be on the internal network at all if their purpose was simply to provide some information and a capability to edit and upload CVs? An isolated network of dumb terminals with a web browser and an InfoPath form would more than suffice.
    Secondly why are they on the domain? And why log the user on with a domain account?

    I suspect that the account was simply a member of the default Domain Users group, rather than a local admin as Keith was able to do little more than access open shares, which also suggests that the default permissions are used on way too many internal servers.

    The whole concept of “Defence in Depth” as a fundamental architectural principle seems to be missing here.

    At the front line, even given the domain membership and user account, GPOs and local policy should have locked things down much more tightly.

    Then, there should be network policy enforcement points between the kiosk networks and anything important.

    Moving further back, the server security seems pretty lax.

    There are a lot of comments about the kiosks having been audited. I’m going to reserve judgement until I know more about the scope of the audit and just how much they were allowed to look at.

    An audit of that type is really only a point in time snapshot.

    It may well be that the kiosks were configured in a more secure fashion at the time of audit and things got slack after that.

    I’ve seen it before where the external auditors were limited to a very narrow scope, sometimes partly for fear of breaking things.

    1. Lucy

      There’s one other factor, apart from regulatory requirement, that will ensure attention to security: money. If your company’s existence and profits depend on keeping information secure (especially, e.g., tech or financial institutions), you’ll pay to do it right the first time. Regulation generally works because it imposes large financial impacts on security failures.

  7. rfdawn

    A frequent factor in dis-integrated IT security is the use of multiple IT contractors. I wonder how many MSD has used?

    Ah, let’s check the invoices!

  8. sysadman

    It’s almost like you have been working for years in the company I am currently at.

