January 15, 2019

My 2018 SANS Kringlecon Holiday Hack Report

Here's my 2018 SANS Kringlecon Holiday Hack report, for which I received a Super Honorable Mention!

U Can't Touch This - Thoughts on Data Integrity vs Confidentiality

Cannot Touch "This"

Intro

As I think about MC Hammer's song in the context of lacking the write permissions needed to use the Linux "touch" command (see above screenshot), I can't help but think about the bigger implications of read vs write access.  I know, two blogs about song puns in a row.. 😩 The Principle of Least Privilege (sounds like an awful job title, by the way) teaches us that if write permission isn't necessary for a user's role, their access should be restricted to read-only at most.  Write access is inherently more dangerous than read access, as an attacker or careless user can modify essential operating system files or introduce malicious code.  Ask Peter Parker's uncle about the responsibility of having great power sometime.. oh, that's right..
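
To make that concrete, here's a minimal Python sketch of the least privilege idea (the file path and contents are made up for illustration).  Once write permission is stripped, even the owner can't modify the file without deliberately changing its mode back:

```python
import os
import stat

# Hypothetical file, created just for this demo
path = "/tmp/secret_sauce.txt"
with open(path, "w") as f:
    f.write("the recipe\n")

# Least privilege: owner may read (mode 0400), nobody may write
os.chmod(path, stat.S_IRUSR)

try:
    with open(path, "a") as f:  # attempt to modify the file
        f.write("tampered!\n")
except PermissionError as err:
    # Note: running as root would bypass this check entirely
    print(f"U can't touch this: {err}")
```

Keep in mind that root (or an attacker who escalates to it) bypasses file permissions entirely, which is why permissions are one layer among several rather than the whole story.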

This got me thinking about cyber security breaches and how the media focus is typically on data confidentiality, the disclosure of sensitive data (read access).  This is bad, no doubt.. and some businesses care most about keeping their secret sauce.. well, secret.  However, I'd like to consider another aspect of the CIA Triad: Integrity.  What if the attacker modifies the data (write access)?  What kind of damage could that do to an organization?  What can be done to detect this?  Which businesses would be impacted most by this?  What's an MC hammer, exactly?

I Know You Touched My Data!

Unaltered & Authentic MC Hammer Message
Proof by Parachute Pants (PPP)

The integrity of data, in its simplest definition, means the original data has not been altered or modified in any way that's unexpected.  Put simply, it hasn't been tampered with.  Integrity plays a role for data in transit as well as data at rest, although I'll be focusing primarily on data at rest in this article.  TLS and other encryption protocols help to protect the integrity of data over the wire.  As we know, without that protection it's trivial for a well-positioned attacker to intercept data and modify it in real time, redirecting credit card information, credentials, and just about anything else imaginable.  We want to be able to trust that the bank we're using online isn't being tampered with somewhere in the middle.
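
TLS handles this under the hood (its actual record protection is more involved than this), but the core integrity primitive is easy to demonstrate.  Here's a minimal sketch using Python's standard hmac module, with a made-up key and message: anyone who alters the message without holding the key can't produce a matching tag, so tampering is detected on arrival:

```python
import hashlib
import hmac

# Illustrative shared secret -- in practice, generate and store this securely
key = b"parachute-pants"

def seal(message: bytes) -> str:
    """Create a digital 'wax seal': an HMAC-SHA256 tag over the message."""
    return hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(message: bytes, tag: str) -> bool:
    """compare_digest() does a constant-time comparison of the tags."""
    return hmac.compare_digest(seal(message), tag)

original = b"STOP! Hammer time."
tag = seal(original)

print(verify(original, tag))               # True  -- seal intact
print(verify(b"STOP! Hacker time.", tag))  # False -- tampering detected
```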

When I think about it, ensuring a message hasn't been tampered with goes back to even the earliest days of human history.  Wax seals like the one above (historically accurate, in case you were wondering) were used with letters in medieval times, much like public key cryptography today, to ensure the integrity of the message and the authenticity of the sender.  Heck, my wife does this with her favorite pregnancy-induced chocolate pretzel cravings by turning the container a special way that only she is privy to.  I'd have to pull an Indiana Jones and the Temple of Doom or Ocean's Eleven heist to get to them without her knowing, so it's simply not worth the effort or risk for me.  Other examples include police crime scene tape, or those stickers you see on gas pumps and electronic hardware to prevent tampering for warranty or safety purposes.  I've seen water-reactive material in cell phones that shows if you slam-dunked your newest iPhone into the toilet.  All of these examples demonstrate an effort taken up front to protect the contents of a publicly accessible item and to prove without a doubt that they haven't been altered from their original state.

So how, then, do we verify the integrity of our data if we can't place it into a metal snowflake pretzel container and turn it to 40 degrees North?

Ways of Checking and Verifying Digital Information

  • Backups / Version Control
One of the most basic ways of making sure your data hasn't been altered over time is to compare it against an original copy, preferably stored offline and protected.  If I attempted to replace chocolate pretzels with chocolate raisins, I'd expect my wife to notice the diff.  Speaking of diff (please, contain your laughter), version control works in much the same way backups do, in that it allows you to compare the changes between an original copy and the current version to determine if our data is legit.

Diff Command Showing Changes Between Python Files (2 legit 2 quit)

When I'm working to help an organization prepare for the possibility of a ransomware attack, one of the first questions I ask is, "Do you have backups?"  This is followed by, "How frequent are full backups, and are they stored offsite?"  In an incident response scenario, it's easier and more economical to eradicate the infestation and simply restore the original files, which you can trust haven't been altered, than to mess with attempting decryption later on.
  • CRC / Hashing / Signatures
Cyclic Redundancy Checks (CRCs) have been around since the early years of computing.  Their purpose is primarily to check for accidental changes or corruption to data.  Unreliable storage media, protocols, or networks are just some examples of how data can get corrupted.  I'm sure most of you at some point downloaded a file over multiple weeks with your 28.8kbps modem only to find out it was corrupted when executing it.  No?  Just me?  (If you haven't lately, kiss your broadband and tell it how much you love it.)

Hashing is similar, but computationally more complex than a CRC, and is designed with security, uniqueness, and randomization in mind.  MD5, SHA-1, and the SHA-2 family (including SHA-256) are just some of the hashing algorithms used today to create a sort of signature that represents a file in its entirety.  Hashes can also be used in other aspects of security, but we're going to focus primarily on their use in data integrity.  This is the preferred method today to verify the integrity of a file.  You've probably noticed that when you download a file from a website, there's often a hash listed near the download button.  This is so that, once the file is downloaded, the user can verify it's a one-to-one, byte-for-byte copy of the intended file (see the sketch just after this list).  Firmware updaters will often verify a file's hash against an online database before flashing, since corrupted binaries can brick, or render useless, a device.
  • Metadata
Although not generally a great way to check file integrity, metadata can sometimes be used to see if a file has been altered from its original state.  Attributes like file size and last-modified timestamps are commonly used in operating systems to compare files for copy/replace-type operations.
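
To make the hashing bullet above concrete, here's a minimal Python sketch of the download verification scenario.  The file name and the published value are hypothetical stand-ins; normally you'd copy the real hash from the vendor's download page:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Hash the file in chunks so large downloads don't exhaust memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical values for illustration only
published = "<sha-256 value copied from the vendor's download page>"
actual = sha256_of("installer.bin")

# A single flipped bit yields a completely different digest
print("Intact!" if actual == published else "Corrupted or tampered with!")
```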

Detection

  • File Integrity Monitoring (FIM)
There are a lot of software solutions available to monitor files for changes on macOS, Linux, and Windows workstations.  Some of these, such as the open source fswatch, can be set up to send a notification when specified files are accessed or modified.  This allows the user to designate which files are most sensitive and to ignore commonly written files, such as system or temporary files, which are constantly being written to.  Data classification may be an essential process in knowing which files are the highest priority to monitor (a bare-bones sketch of the idea follows this list).  There are also solutions for enterprise environments, such as Tripwire, FireCompass, and ADAudit, to name a few.  In addition to files, there are several options available to monitor the contents of database servers.
  • Version Control
Several enterprise and consumer backup solutions now support version control, which keeps a record of every file modification along with its metadata.  These can sometimes be configured to alert when there's a new change, such as a push to a Git repository.  Doing a diff or similar file comparison between the current version and the previous one will show the differences between the two files, preserving an audit trail back to the original and thus protecting its integrity.
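
Putting FIM and hashing together, here's a bare-bones Python sketch of the baseline-and-recheck idea mentioned above.  The watch list and baseline file name are my own inventions; real FIM products add kernel-level hooks, alerting, and tamper protection for the baseline itself:

```python
import hashlib
import json
import os

SENSITIVE = ["/etc/passwd", "/etc/hosts"]  # illustrative watch list
BASELINE = "baseline.json"                 # hypothetical baseline store

def sha256_of(path: str) -> str:
    with open(path, "rb") as f:  # fine for small files; chunk large ones
        return hashlib.sha256(f.read()).hexdigest()

def snapshot() -> None:
    """Record the trusted 'gold standard' state of each watched file."""
    baseline = {p: sha256_of(p) for p in SENSITIVE if os.path.exists(p)}
    with open(BASELINE, "w") as f:
        json.dump(baseline, f, indent=2)

def check() -> None:
    """Re-hash each file and report anything that no longer matches."""
    with open(BASELINE) as f:
        baseline = json.load(f)
    for path, expected in baseline.items():
        if not os.path.exists(path):
            print(f"MISSING:  {path}")
        elif sha256_of(path) != expected:
            print(f"MODIFIED: {path}")  # integrity violation -- investigate!
        else:
            print(f"OK:       {path}")

if __name__ == "__main__":
    snapshot()  # run once while the system is in a known-good state
    check()     # then run periodically (cron, scheduled task, etc.)
```

Run the snapshot once while everything is known-good, schedule the check to run periodically, and anything flagged MODIFIED gets compared against your change control records.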

Awesome!  These are great for checking a particular file at a point in time against the original, but what do we do if we don't have anything from the original to compare against?

Did You Do Anything While I Was Gone?

What if these methods aren't in place?  If there's no baseline of original, trusted files to compare against, there's no way to know whether those files are untouched.  You can imagine how disconcerting it would be to have a security incident and no way of knowing if your data was accessed or altered.  What if your business depended on the integrity of your data?  Education facilities, health care institutions, financial and credit businesses, and government databases are just some examples of organizations that depend on data integrity.  How would it look if they lost confidence in the accuracy of their data?  It's one thing if your cousin's twelve-year-old changes their math grades (not cool, by the way), but what if a hacktivist organization plants a warrant for your arrest?  What if your credit history was destroyed, or your bank account wiped out, by "Bob" or "Alice" from across the globe?

In my opinion, if a breach occurs and there's no way to verify the integrity of your data, you must assume it's been tampered with.  That could destroy the reputation of an organization!  That's why there's so much importance placed on backups and auditing: not just for availability, which is often the only thing backups are thought to offer, but for integrity as well.

Sometimes changes are intentional.  Having a good change control policy in place (and actually following it) is critical.  Being able to look at a change and verify it against an approved record can tell you whether it happened by accident, via corruption, or was intentional and legitimate.  2 legitimate..

Possible Attacks

What kind of attacks could bad actors carry out, technically?  What would they hope to accomplish?  Just some examples off the top of my head: granting themselves physical access through a facility's electronic door system, altering banking records for financial gain, clearing legal records, disabling security cameras, rappelling down a rope into a bank vault.. er hum.. my apologies, I went a little too Hollywood on that one.  You get the idea though; it would be baaad if you couldn't detect and correct changes like these.  It would be Hammer Time for someone in that position.  😅

Conclusion

What can be done to protect files?  As with anything in security, it's all about having layers of protection.  It begins with data classification and capturing some kind of data "snapshot" as a baseline.  Permissions and proper user roles help prevent overly open access, which could lead to abuse or accidental modification.  Finally, file monitoring systems, or at least a baseline of sensitive data, give you a gold standard to compare against.  Auditing and logging are essential, but some logging levels may not capture enough detail: it's not enough to prove someone accessed a file if you can't prove it wasn't modified.

As always, I appreciate the time spent reading my crazy ramblings!  These blogs are not meant to be a be-all, end-all definitive guide to infosec.  I'm hoping to learn from my readers with different opinions and thoughts.  Please provide feedback in the comments section below if you'd like! 😃

"Technology is important because it creates the future. We're able to be a part of the "next" and create things that don't exist." - MC Hammer (MC Hammer did not approve any content in this blog)

Thank you,
- Curtis Brazzell