Where most people see all sorts of lovely features when looking at a new product, the security professional sees possible ways to abuse it. The security mindset is one of the key characteristics of engineers in the security field: we do not look for solutions, we look for failures.

Many offices have some sort of key-card-based system to provide access control. These systems are quite neat: you can open the door with a simple badge, and the system knows who is trying to enter. From a functionality point of view, this is a nice combination of usability and accountability. However, if we look at it from a security point of view, we can ask ourselves several questions. For example: are these cards easily copied? As they operate wirelessly, do we even need physical access to copy one?

To give a different example, an old web application package used to redirect users who were not logged in, but tried to access the administrative interface, to the login page. Administrators could use this page to log in without any hassle. However, I wondered what would happen if I simply ignored the redirect. To my astonishment, the application did not return an error, but simply showed the management console.
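The underlying flaw in stories like this is usually the same: the handler issues the redirect but forgets to stop processing the request. The sketch below is hypothetical (the names and structure are invented for illustration, not taken from the application in question), but it shows how a client that ignores the redirect still receives the protected page.

```python
# Hypothetical sketch of the flawed pattern: the handler redirects
# unauthenticated users but forgets to stop, so the admin page is
# still rendered into the response body.

def render_admin_console() -> str:
    return "<h1>Management console</h1>"

def handle_admin_request(logged_in: bool) -> dict:
    response = {"status": 200, "headers": {}, "body": ""}
    if not logged_in:
        response["status"] = 302
        response["headers"]["Location"] = "/login"
        # BUG: a 'return response' is missing here, so execution
        # falls through and the console is rendered anyway.
    response["body"] = render_admin_console()
    return response

# A well-behaved browser follows the Location header and shows the
# login page; a client that ignores the redirect just reads the body.
leaked = handle_admin_request(logged_in=False)
```

A single missing `return` is all it takes: the status line says "go away", while the body hands over the console.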

Testing to Fail, Not to Pass
Most systems are evaluated by testing whether they meet their functional requirements. This is not so strange: these products are made to provide a solution and thus need to fulfil their requirements effectively.

However, security issues are commonly not found by testing for conformity. To find real security issues, one needs to think and act like an attacker; in other words, one needs to test for failure. Simply put, this involves asking yourself what would happen if you, for example, entered an unusually large amount of text, changed a certain parameter, or tried to execute a restricted internal command.
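The three probes above can be turned into failure-oriented tests. The handler below is invented for this sketch (its name and limits are assumptions, not a real API); the point is that the tests feed it the inputs an attacker would try, rather than only the happy path.

```python
# A hypothetical input handler used to illustrate testing for failure;
# the function and its limits are invented for this sketch.
def process_username(name: str) -> str:
    if len(name) > 64:
        raise ValueError("username too long")
    if any(c in ";|&" for c in name):
        raise ValueError("illegal character")
    return name.lower()

# Failure-oriented probing: check that hostile inputs are rejected,
# not merely that valid inputs are accepted.
def probe(payload: str) -> str:
    try:
        process_username(payload)
        return "accepted"
    except ValueError:
        return "rejected"

results = {
    "oversized": probe("A" * 10_000),   # unusually large input
    "injection": probe("alice;reboot"), # command-style tampering
    "normal": probe("alice"),           # the happy path still works
}
```

A conformity test would stop at the `"normal"` case; the other two are the ones that find security bugs.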

Stealing Notebooks for Science
Back in the first year of my Master’s education, we had a social engineering project. In short, several employees of the university were handed a notebook under some made-up pretext. After this had been done, the students were tasked with obtaining the notebooks through social engineering, i.e. hacking people, and acting like an attacker.

By looking at the world from the side of the adversary, new students immediately get an understanding of the security mindset and how to apply it in specific situations. For example, we learned that the receptionist of one of the buildings on campus was more than happy to open up one of the rooms for us to adorn it with birthday decorations. Of course, we used this knowledge to acquire the notebook.

Bruce’s Ants and Mail You Should Not Reply to
A classic example of the security mindset is the ant farm Bruce Schneier got when he was young. The farm did not yet contain ants; one had to order them by filling out a card and sending it by postal mail. Most people would argue that it is neat that the company sends you fresh ants by mail. However, the security engineer wonders whether he could abuse the fact that he could have a tube of ants sent to any arbitrary address.

In another case, someone registered the domain donotreply.com, which caused him to receive all sorts of confidential details from people who replied to automated messages. Additionally, bounced messages intended for non-existent e-mail addresses also ended up in his mailbox.

Take on the Evil Eyes!
There are many more examples of how the security mindset makes you look at the world differently. In practice, many security professionals view the world this way every day, wondering how they could shoplift, bypass airport security, or drive out of the parking lot without paying. However, ordinary engineers can benefit from taking on the adversarial mindset while solving a problem, too.
