However, the task gets trickier now that a BYOD (Bring Your Own Device) culture is sweeping the office. When people use their own devices, what rights does the corporation have over them? I think it comes down to two basics:
- Does the device attach to the company network, i.e. actually sit inside the corporate firewalls?
- Does the device hold, store or manipulate company data, i.e. data that is owned by the company?
If you can answer "Yes" to either of those questions, then, regardless of who owns the device, you and your company must insist that it has certain security capabilities implemented. I would insist on:
- User-ID/Password Secured Login: No computer should hold company data without at least a login password to keep it from prying eyes. This goes for mobile devices as well as computers - most now support a passcode or swipe pattern (not just the basic unlock swipe) to control access.
- Anti-Virus/Anti-Malware & Firewall: Another basic, but many people don't bother. For Windows-based devices it's absolutely essential, and even on platforms with a safer reputation (Apple/Linux) you should probably insist on it.
- Data Encryption: This should go hand-in-hand with the system login requirement. Most systems can encrypt data at rest; insist that the capability is switched on.
If the person whose device it is declines to implement these requirements, then you must decline them access to your company's network and data; it's as simple as that. The IT department needs air cover from the CEO to ensure people don't creep round to the back door and get access by pulling a favour.
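The decision flow above - the two-question test, then the mandatory controls - can be sketched in a few lines. This is a minimal illustration only; the device attributes and capability names are my own assumptions, not any real MDM schema:

```python
# Hypothetical sketch of the BYOD access decision described above.
# Field and capability names are illustrative, not a real MDM API.
from dataclasses import dataclass, field

# The three mandatory controls from the list above.
REQUIRED_CAPABILITIES = {"password_login", "antivirus_firewall", "data_encryption"}

@dataclass
class Device:
    joins_company_network: bool        # sits inside the corporate firewall?
    holds_company_data: bool           # stores or manipulates company-owned data?
    capabilities: set = field(default_factory=set)  # security features enabled

def access_allowed(device: Device) -> bool:
    """Apply the two-question test, then check the mandatory controls."""
    in_scope = device.joins_company_network or device.holds_company_data
    if not in_scope:
        return True  # policy doesn't apply to this device
    # In scope: every required control must be present - no favours, no back doors.
    return REQUIRED_CAPABILITIES.issubset(device.capabilities)

compliant = Device(True, True, {"password_login", "antivirus_firewall", "data_encryption"})
refused = Device(False, True, {"password_login"})  # holds company data but unencrypted
```

Here `access_allowed(compliant)` is true and `access_allowed(refused)` is false: the second device holds company data but lacks encryption, so access must be declined.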
If you want to go a few steps further, I'd suggest two other requirements:
- Tracking Software: Individuals can install tracking software on their devices very easily, and many options are free for private use - Prey being a good example.
- Remote Wipe: More devices now ship with the ability to initiate a remote wipe, and there are also third-party applications that will do the remote wipe for you.