F-Secure cyber security advisor Erka Koivunen recently wrote about white-collar hacking.
Usually, we can identify five different types of hackers: hackers, hacktivists, criminals, governments, and extremists. They all seek to take full advantage of software by repurposing it to fit their needs in ways its makers never intended. But, as Erka suggests, maybe it is time to add a sixth type of hacker – the business executive!
Why would we make this assertion?
Let’s take another look at cases where businesses have resorted to hacking-style practices in an effort to hide the true nature of their actions.
The natural starting point is the still-unfolding emissions scandal, in which Volkswagen and affiliated brands sold at least 11 million diesel cars equipped with software that could cheat emissions tests. It now seems evident that producing an environmentally friendly diesel engine is no easy feat. Instead of trying to push the limits of engineering, the company decided to examine how it could cheat most easily. This, it now seems, started several years ago: engine parts supplier Bosch warned Volkswagen as early as 2007 about concerns that a component of theirs may have been altered to display less-than-honest emission profiles under standardized test conditions.
Standardized testing, there you have it!
Let’s hear what Erka has to say about the topic:
If all you care about is to pass a test, why bend the laws of physics and develop something novel or effectively environmentally friendly? Why not just rig the test and invest in lobbying efforts to keep the testing practices stable and as predictable as possible?
Now, does this make you want to double-check the water consumption levels of your washing machine and the energy efficiency of your fridge? Ever wondered how PCI-compliant merchants and officially accredited government networks still get breached? They are all audited against a certain set of criteria that the hackers will not respect. In an adversarial game of software security, it is the hacker who rules, not the one writing the rules.
Speaking of writing rules
We recently came across a set of regulatory requirements for the procurement of telecommunications network equipment for Internet service providers in India. In an attempt to protect the Indian communications infrastructure from vulnerabilities, the government requires that the software be demonstrably bug-free when delivered. Nothing wrong with that. But then the lawmaker proceeded to demand that the software must not have vulnerabilities in the future either!
Think about that for a second.
Corporate executives are professionals in business risk management. Faced with a business scenario, they will seek to avoid, transfer or minimize the risks associated with it. The risk of regulatory penalties is a regular item in C-suite considerations. So it is only natural for a clever business executive to seek ways to hack their way through the rulebook. In the vast majority of cases, what they do is of course legal. But not necessarily something they would be proud to state in public.
By requiring that software must not contain any bugs – known or unknown – you effectively incentivize software vendors to stop searching for (let alone disclosing) vulnerabilities in their software after it has shipped. Furthermore, you also force them to lawyer up every time a potential vulnerability is found. It would be curious to know whether the Indian ISPs get security updates for their equipment. Ever? At worst, their effective security will be worse because of the overly ambitious regulation. Cheating? Debatable. Unfair? Maybe.
Now, getting back to Volkswagen. Having a hard time producing cars that meet ambitious regulatory demands? Fail to comply and you will be banned from selling your cars in a lucrative market? Then somebody comes up with an idea to simply fool the test and continue selling your non-compliant product like nothing happened. All it takes is a simple software fix to a component nobody ever takes a closer look at! Heck, let’s even add a EULA that forbids customers from reverse-engineering the software! Sounds tempting, doesn’t it? Cheating? Definitely! But in a hacker kind of way.
“Tech sceptic / consumer advocate” Bob Sullivan spells it out in his article:
“In short, consumers have been hacked. Their cars’ software was doing things without their knowledge, just as if a virus writer had dropped a Trojan on their machines.”
Software life cycle
For a cheater that runs an immutable system like that of a typical car, there needs to be a certain level of guarantee that the testing procedure does not change overnight. The hack is specific to a certain kind of test scenario and would be rendered ineffective should the procedure change. One can only imagine the volume and substance of VW’s lobbying efforts to keep the testing practices as stable and predictable as possible!
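To make the mechanics concrete, here is a purely illustrative sketch of how such a defeat device could recognize a standardized test. The sensor names, thresholds and engine-map names are all invented for illustration – nothing here reflects Volkswagen’s actual code:

```python
# Purely illustrative: a hypothetical "defeat device" heuristic.
# All inputs and thresholds are invented; real test-detection logic
# reportedly used signals like steering angle, speed profile and
# barometric pressure, but the details below are assumptions.

def looks_like_dyno_test(speed_kmh, steering_angle_deg, ambient_temp_c):
    """Guess whether the car is on a test bench rather than the road.

    On a dynamometer the driven wheels turn while the steering wheel
    stays centred, and ambient conditions match the controlled lab.
    """
    wheels_turning = speed_kmh > 0
    steering_fixed = abs(steering_angle_deg) < 1.0   # wheel never moves
    lab_conditions = 20 <= ambient_temp_c <= 30      # controlled room temp
    return wheels_turning and steering_fixed and lab_conditions

def choose_engine_map(speed_kmh, steering_angle_deg, ambient_temp_c):
    # Run the clean-but-weaker calibration only when a test is suspected;
    # otherwise run the calibration that performs well but emits more.
    if looks_like_dyno_test(speed_kmh, steering_angle_deg, ambient_temp_c):
        return "low_emissions_map"
    return "high_performance_map"
```

Note how brittle this is: the check is keyed to one specific test procedure, which is exactly why the cheater needs the procedure to stay stable and predictable.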
However, VW forgot one crucial thing that left them exposed not only to detection but also to expensive recalls. The one thing that any modern software company – and their fellow hackers – would have told them to fix right away. Namely, a mechanism for delivering software updates remotely.
Software has a relatively short half-life. Bugs will be found, functional requirements will evolve, and underlying platform components and environmental specifications will change. When the old code becomes a burden, the software needs to adapt.
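The short half-life ultimately reduces to a simple recurring decision: compare what is installed against what is available, and fetch the newer build. A minimal sketch of that comparison, assuming plain dotted version strings (the function name is my own; real firmware versioning schemes vary):

```python
# Minimal sketch of the decision at the heart of a remote-update
# mechanism: is the available build newer than the installed one?
# Assumes plain dotted version strings such as "1.9" or "1.10".

def needs_update(installed: str, latest: str) -> bool:
    """Compare dotted versions numerically, so '1.10' beats '1.9'."""
    def to_tuple(version: str):
        return tuple(int(part) for part in version.split("."))
    return to_tuple(latest) > to_tuple(installed)
```

A naive string comparison would wrongly rank "1.9" above "1.10", which is why each part is compared numerically. Without any such mechanism – as in a typical car of the era – every fix means a physical recall.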
Volkswagen should have realized that if their cheating software were ever spotted, they would need to get rid of it. Fast. Millions of times over. This is where their risk calculations failed them miserably. When the scam had already begun to unravel, Volkswagen initially kept claiming that the figures were merely the result of a minor discrepancy and a software error that could easily be fixed with a recall. It wasn’t until the EPA and CARB threatened to withhold certification for its 2016 diesel models that Volkswagen admitted its wrongdoing in early September.
VW itself has announced that it has set aside €6.5 billion ($7.3 bn) to fix the software through a recall program and to settle other demands as they emerge. Some analysts expect the eventual total to be much bigger.
Such examples seem to imply that companies might be tempted to seek an unfair competitive edge through the ‘innovative’ use of software. This is a question of software assurance. Is it enough that these cases are caught only through data breaches or by chance when someone with a curious mind decides to take a closer look at their gear?
Putting more effort into finding vulnerabilities and purposely hidden features of software products will improve security for all. After all, the fewer zero-day vulnerabilities there are, the fewer opportunities there are for cyber criminals, spies and terrorists to exploit them.
Erka’s points make me think. With the Internet of Things becoming an everyday phenomenon, are we, as citizens and as human beings, ready for a world where technology, rather than making our lives easier and safer, can be a further threat?
What about businesses themselves – how widely would they be tempted to resort to insincerity to reach better business results? If this type of practice becomes a standard way of doing business, it would almost compel other businesses to follow suit or risk losing business – at least until such wrongdoings are found and published. Are we willing to tolerate doping in the form of software hacks, or should we promote better testing?