When incident detection vendor SecBI found suspicious activity on company devices at one of its clients, they passed on the data with the expectation that the client, a large European enterprise, would investigate further. That didn’t happen. The client’s security team was not allowed to look at the data due to privacy concerns.
A contract with the company’s employee union prohibited anyone in the organization from looking at employees’ personal data (e.g., browsing data, banking transactions, or healthcare provider interactions) stored on their work computers, even though they were owned by the company. Although SecBI’s data indicated possible bad behavior on the part of an employee, the company did not have sufficient cause to investigate under the terms of the union contract.
Here’s the kicker: The union used language from the EU’s General Data Protection Regulation (GDPR) in its contract with the company to keep it from accessing employees’ personal data on company devices. That put the company’s security team, itself part of the union, in an awkward position: The data showed a potential threat, but they could not confirm the threat without breaching the union contract. If there indeed was a data breach, they risked breaking the GDPR’s 72-hour reporting rule.
“This organization has a security operations center. It has tools and sensors to capture log data coming from the various devices deployed or assigned to employees, but the people in the SOC are very restricted from looking at the data being collected that is necessary to do their job, whether some laptop is compromised or somebody is misbehaving in a way that might pose a risk to the organization,” says Alex Vaystikh, CTO and cofounder at SecBI. “The organization is now struggling to balance between the privacy and the [GDPR requirement] to find and disclose compromise within 72 hours. They have a chicken-and-egg problem.”
The lesson here for every company struggling to meet GDPR compliance: Protect privacy, but don’t weaken your ability to detect and respond to threats in the process.
“A lot of people have a misconception about the EU standards even now for conducting appropriate reviews, monitoring, and following up on suspicious activity,” says Joan Antokol, founder and managing partner at Park Legal LLC and a member of the International Working Group on Data Protection in Telecommunications. “A company can really be in hot water if they don’t track [a threat] down [when data access is restricted due to privacy concerns]. In the meantime, more data is potentially being leaked because they haven’t blocked and remediated.”
A disconnect between privacy and IT security
“There is a disconnect in many organizations between privacy and compliance teams and IT professionals. They have to work together. Privacy and security go hand-in-hand. Like salt and pepper you can’t separate the two of them,” says Antokol.
“Sometimes the privacy professionals expect the security team to know what the regulations say and mean, which is often different from the plain language. Then they get frustrated when the security team doesn’t.” Consequently, security teams apply the same type of logic they use within the context of their jobs, which doesn’t always fit the regulation.
New terminology used in the GDPR also contributes to the confusion. Personal health information (PHI), for example, is a U.S. term and not found in GDPR. “People are making unnecessary mistakes because of the language difference,” she says. “IT is looking for someone to just explain to them what needs to be done.”
In the meantime, IT and security teams have successfully lobbied for additional funds to meet GDPR compliance. “Never waste a good crisis,” said one CISO speaking at a recent conference.
What their organizations get in return might not be enough. “I see clients spending huge amounts of money on the IT side and privacy side and they’re getting crap back,” says Antokol. “They go to management and say they need $500,000 or $700,000 for GDPR. Then they use it on a consultant who gives them some canned document that they populate for lots of different clients. They are not weaving the GDPR into their corporate culture.”
The problem is that experts with skills and knowledge to properly advise on GDPR are in short supply. “Even some reputable firms are hiring people very quickly just to build GDPR staff, people who don’t necessarily have the credentials,” says Antokol. “Then there’s a cottage industry of other people out there who are just trying to capitalize on GDPR. There’s even someone who calls himself ‘the GDPR guy’. The regulators are laughing about it.”
Antokol sees companies struggle to assess outside GDPR talent. A good GDPR consultant will know the history of the regulation, understand the intent of the regulators, do a good gap analysis, and be able to help the company prioritize what it needs to do.
No excuses for the 72-hour rule
Hypothetically, the company in the above example could be at greater risk of a fine from the EU if the data it declined to review turns out to reflect a breach that exposes personally identifiable information (PII). EU regulators could start the 72-hour breach reporting clock from the time the data became available. “They could also be fined for violating the employees’ rights if they don’t have that side covered,” says Antokol.
GDPR’s 72-hour window for reporting a breach once discovered has spooked some companies. “With some CISOs, the frustration with GDPR causes them to sometimes lean toward being kept in the dark and ignorant of an attack until they can’t ignore it,” says Vaystikh. “They might do something to avoid discovering it early.” He adds that some companies aren’t sure how to report, and they lack the tools to orchestrate proper reporting within that window. “The old infosec tools are not a good fit for the regulation,” he says.
Restricting access to PII plays into hackers’ hands
If hackers know that a company restricts internal access to data on employee devices, they will target those devices as entry points to the system, Vaystikh believes. Typically, the hacker assumes that everything is being monitored. “Instead of establishing a back door at some server that the hacker is confident will be monitored, they can launch the attack from my machine. The chances of being caught are much, much smaller. Anywhere a hacker can hide amongst privacy-oriented resources, it will influence the way an attack is carried over,” he says.
6 best practices for protecting employee data
1. Keep calm and carry on. Overreacting out of fear of big fines can put an organization at greater risk. Antokol has seen many examples of this. “Many people are in a state of panic. The panic comes from articles and presentations at conferences that really hype [the fines] up. There’s some truth behind that, but [the big fines] are going to be in extreme cases and not for companies showing good faith and doing what they need to do.”
She sees parallels to HIPAA in the U.S. “Everybody was very concerned before April 2003 when the privacy rule went into effect. There was the same type of frenzy with people selling services for millions of dollars,” she says. “It isn’t like the regulators are going to come to your door May 25 and demand an on-the-spot inspection. There will be a gradual decline in the anxiety level as people become more comfortable with GDPR. It’s a law like any other law that they will eventually master and integrate into their ongoing business activities.”
2. Get employee consent to access their PII. “There are certain types of [personal] data that are protected under GDPR, and employees must give consent for processing,” says Mathew Keshav Lewis, global head of banking and regulatory practice at legal service provider Axiom. “Employees typically give consent when they on-board to a new employer or during an annual refresh of compliance policies and procedures, so that the company can protect itself. GDPR becomes one more reason for this formal consent process.”
Lewis notes that while there is nothing in the GDPR that explicitly refers to non-business-related employee PII stored on company devices, it’s still important to consider this data as part of a consent process. “If an employee does not consent, then the firm may need to restrict access to systems and make a determination as to whether that person can continue in the role,” says Lewis.
Lewis also points out that consent is not the only lawful basis for processing, and employers should consider the others. For example, processing may be necessary for the performance of a contract to which the data subject is party, to take steps at the request of the data subject prior to entering into a contract, or for compliance with a legal obligation to which the controller is subject — all viable bases under the employment contract.
3. Limit the amount of data analyzed. Part of the privacy problem is the amount of data that security analysts need to sift through to find the bits relevant to a potential threat. Vaystikh sees the trend toward greater privacy protections clashing with corporate tendencies to collect as much data as possible in the name of security. That conflict is embodied in the GDPR itself. “The tension between GDPR fast detection and disclosure versus privacy is interesting. We all want privacy, and we all agree that too much information about us is collected.”
Limiting the scope of data accessed would ease the concerns of companies with a conservative approach to protecting PII, like the company mentioned above. Technology can help. “The solution is to limit the number of logs and analysis to go over [for security to do its] job,” says Vaystikh. Tools based on artificial intelligence can help security teams process and summarize massive logs, and then hand only the information with the highest probability of indicating malicious activity to a human analyst. “This information is extremely focused in terms of scope, and it can run on-premise and by the organization so no third party can look at this information,” he says.
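The approach Vaystikh describes — pre-filtering and minimizing log data so analysts only ever see a focused, high-probability subset — can be sketched in a few lines. The following is an illustrative example, not a depiction of SecBI’s product: the field names, the toy risk-scoring rule, and the pseudonymization key are all assumptions made for the sketch.

```python
import hashlib
import hmac

# Illustrative key only; in practice this would live in a secrets manager
# and be rotated. A keyed hash (HMAC) prevents reversing pseudonyms by
# brute-forcing known usernames, which a plain hash would allow.
PSEUDONYM_KEY = b"example-rotate-me"

def pseudonymize(user_id: str) -> str:
    """Replace a user identifier with a short keyed hash (HMAC-SHA256)."""
    return hmac.new(PSEUDONYM_KEY, user_id.encode(), hashlib.sha256).hexdigest()[:16]

def risk_score(event: dict) -> float:
    """Toy scoring rule: flag known-bad destinations and off-hours activity."""
    score = 0.0
    if event.get("dest_domain") == "known-bad.example":
        score += 0.8
    if event.get("hour", 12) < 6:
        score += 0.3
    return min(score, 1.0)

def events_for_analyst(raw_events, threshold=0.7):
    """Yield only high-risk events, with user identifiers pseudonymized,
    so the analyst never sees raw PII or the bulk of the log stream."""
    for ev in raw_events:
        score = risk_score(ev)
        if score >= threshold:
            yield {
                "user": pseudonymize(ev["user"]),  # pseudonym, not the real ID
                "dest_domain": ev["dest_domain"],
                "hour": ev["hour"],
                "score": score,
            }

logs = [
    {"user": "alice", "dest_domain": "intranet.example", "hour": 10},
    {"user": "bob", "dest_domain": "known-bad.example", "hour": 3},
]

for event in events_for_analyst(logs):
    print(event)
```

In this sketch, only the one high-risk event reaches the analyst, and even then under a pseudonym; the real identity would be resolved only after a formal investigation is authorized, which maps onto the kind of policy framework Antokol and Lewis describe below.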
4. Create and communicate clear policies. Organizations are obligated to investigate all potential breaches “quickly and in the right way,” says Antokol, even if that means looking at employees’ personal data wherever it resides on the system. She urges companies to create policies and controls around how they conduct those investigations.
Lewis notes that GDPR gives companies the right to investigate “if necessary in the legitimate interest to be pursued by the controller or a third party,” quoting relevant text from the GDPR. “If you can establish a legitimate interest, pointing to a regulation looking to protect against fraud, you can pursue what you need to. You just need a very clear framework under which you’re operating, and employees need to be aware of that framework,” he says.
5. Hire qualified outside help. Antokol suggests that organizations have a written agreement with any outside GDPR consultant that confirms the consultant’s expertise. It’s appropriate to ask for indemnification and confirmation that the assistance and materials they deliver will be able to withstand an audit.
“Ask how much expertise they have in privacy, in EU privacy, and in GDPR,” says Antokol. Ask who will be doing the work, too. “You have sales people recruiting clients, and then a different group of untrained associates many times doing the work.” Agree on the time commitment that the experts will put in.
6. Get a data protection officer (DPO). A good DPO will understand the GDPR and help keep compliance priorities in proper perspective. DPO-as-a-service is a viable option. “A lot of the organizations I’ve dealt with providing DPOs are actually helping,” says Antokol. “There’s a shortage of DPOs, so those companies [providing DPOs] are providing a legitimate service.”