Thank you dear subscribers, we are overwhelmed with your response.
Your Turn is a unique section from ThePrint featuring points of view from its subscribers. If you are a subscriber and have a point of view, please send it to us. If not, do subscribe here: https://theprint.in/subscribe/
The Deviation from Established Norms
A procedural but polite conversation takes place between the organisation and the employee. Certain patterns linked to his access credentials have come under review and are discussed with him. He downloaded certain files, logged in at odd hours, accessed information not directly tied to his current deliverables, and shared his access credentials with another team member. He explains that there were deadlines to be met, so there was no time for caution. He downloaded the documents to keep an offline record, viewed some information out of curiosity, and logged in at unusual hours to clear a backlog.
There wasn’t any accusation yet, only concern about deviation from established norms. His answers were documented, and he was reminded of policies and compliance requirements. As a precaution, some of his access was restricted.
Malice is an Intent to Harm
The deviations that later get classified as “malicious” do not usually begin with an intent to harm.
They may begin with an intent to dig deeper, or in self-preservation, or both. Fear is a strong motivator. And the fear of organisational backlash for missing deadlines, falling short on quality, or losing a long-awaited promotion can motivate us to deviate. Deviations, therefore, are our graded response to pressure at the workplace.
But the cybersecurity mindset is not trained to look at the intent; it is trained to focus on the final act: access credentials shared without authorisation, data copied without a clear need, logins at unusual hours, and information accessed outside the normal scope.
A change starts setting in now, in perception and in trust levels. And by investigating only the act, the origin is misunderstood.
From Trust to Betrayal
In any organisation, access is built on trust, familiarity, seniority, longevity, and past behaviour. Seniority expands access, tenure generates comfort, and the stability of past behaviour gives the system confidence.
The first rupture is procedural rather than emotional. Usually it is an audit, or a compliance observation. Access, which was earlier a sign of trust, is now looked at as evidence. There may not be an accusation just yet, but uncertainty sets in.
There is pressure now, on the system as well as on the employee. The system wants to understand the deviations from normal behaviour. The employee does not know what the system is thinking. Self-preservation sets in as the employee begins to feel scrutinised, and the organisation begins to feel cautious. To the employee, escalation channels do not feel safe, and isolation starts setting in. Nothing is malicious yet, but a new rationalisation takes place as communication thins down. Actions that might have been taken for efficiency, or while facing a dilemma, might now be reinterpreted as something else.
“This is unfair,” the employee thinks. “They deserve to be exposed and I must protect myself.” Data is copied as an insurance, or to be possibly shared later to protect oneself.
Some of the most well-known and consequential insider disclosures came from employees or contractors who were deeply embedded in their organisations, were highly skilled, understood institutional blind spots intimately, and had legitimate access.
They believed internal escalation had failed (happened at a major social media platform), or were convinced that organisational exposure was the only corrective action (an intelligence operative), or were destabilised (the breach at a major US financial corporation).
In each of these cases, their action was framed as a betrayal that had been a long time coming.
Organisational Systems Measure Efficiency, not Emotions
Organisations are built for efficiency, so they measure performance, not stability; they audit transactions, not grievances. The empowerment given to employees for effective decision-making makes it difficult for organisations to interpret deviant behaviour. Therefore, “something” has to happen before organisations detect insider compromise, even if the environment for it was created unintentionally.
Few organisations institute systems to detect the accumulation of grievances. Software can flag an unusual download of data, but can it measure an erosion of belonging?
After a breach, corrective actions are taken. Standard operating procedures are rewritten. But who looks into the softer aspect of it: what could have led the employee to this deviant behaviour? Where did the internal grievance mechanism fall short? Where did the relationship become distant? What was the moral dilemma that prompted the employee to act?
The Insider was once a Believer
The most determined “insiders” are no strangers to their organisations. But when organisations begin appearing strange to them, distancing sets in. And organisations are unable to interpret it in time. The relational failure always precedes the security failure. By the time the relationship turns hostile, it is too late for a U-turn.
When an insider incident surfaces, organisations tend to isolate the act, because it is easier to condemn an employee than to examine culture. But insiders are people who once belonged, who once believed.
None of this absolves either party of wrongdoing.
Nitish Bhushan writes on technology, relationships, and moral dilemmas, examining how human behaviour shapes consequences.
These pieces are being published as they have been received – they have not been edited/fact-checked by ThePrint.
