How Can User Behavior Analytics Protect My Organization?
The human element is the hardest to control in cybersecurity. A network can have the best security system in the world, but if users leave the door unlocked, it won’t protect them. Whether it’s hijacked accounts or disgruntled employees bent on sabotage, improper insider action poses a serious danger. Conventional tools aren’t good at recognizing this kind of problem. They presume that what authorized users do is legitimate.
Spotting abnormal user behavior isn’t easy. People don’t always follow the same routine. They get unusual assignments. People explore the system out of curiosity, without any hostile intent. They succumb to the tempting question, “What happens if I do this?” Recognizing when it’s time to investigate requires subtle techniques. That’s what user behavior analytics (UBA) is about.
What is User Behavior Analytics, and how does it work?
It’s no longer sufficient to identify threats by looking for bit-pattern signatures. Too many new attacks appear regularly. Zero-day threats don’t have known signatures. They camouflage themselves and hide in unexpected places. Existing malware is periodically tweaked so it doesn’t leave the same signature.
The emphasis is now on identifying abnormal behavior. Behavior doesn't necessarily mean the ways people act; it's a matter of following requests and data flows. If there's a request for sensitive information outside a user's job scope, or if a large amount of data is moved somewhere it doesn't normally go, that's unusual behavior. If a user downloads a gigabyte when downloading less than 10 megabytes a day is the norm, that's unusual. A UBA system takes user, network, and authentication logs as its information sources.
“Unusual” doesn’t always mean “suspicious.” Multiple factors affect how worrisome unusual behavior is. Does it involve access to sensitive information? Is it coming from an unusual IP address, especially a geographically remote one? Is the activity happening when the user should be asleep? Does the rate of activity suggest an automated script? The answers to these questions raise or lower the risk level of an action.
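One way to picture how these factors combine is a weighted score. The factor names and weights below are purely illustrative assumptions, not taken from any specific product:

```python
# Hypothetical sketch: combining risk factors into one score for an event.
# Factor names and weights are invented for illustration.

RISK_FACTORS = {
    "sensitive_data": 3.0,   # request touches sensitive information
    "unusual_ip": 2.0,       # source IP not previously seen for this user
    "remote_geo": 2.5,       # geographically distant from usual locations
    "off_hours": 1.5,        # activity during the user's normal sleep hours
    "scripted_rate": 2.0,    # request rate suggests an automated script
}

def risk_score(event_flags: set) -> float:
    """Sum the weights of the risk factors present in this event."""
    return sum(RISK_FACTORS[f] for f in event_flags if f in RISK_FACTORS)

# A download from a new IP address in the middle of the night:
score = risk_score({"unusual_ip", "off_hours"})
```

Real systems learn these weights rather than hard-coding them, but the principle is the same: each answer to the questions above moves the score up or down.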
UBA starts by establishing a baseline of normal activity for each user. It uses machine learning techniques to weigh the factors and, if the deviation looks significant, issues an alert. If a user is assigned new responsibilities, the system needs to take them into account and establish a new baseline.
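The simplest version of baseline-and-deviation detection is a statistical threshold. This toy sketch flags a day's download volume when it sits far outside the user's own history; production systems use richer models, but the idea is the same:

```python
from statistics import mean, stdev

def is_anomalous(history, today, threshold=3.0):
    """Flag today's value if it deviates more than `threshold`
    standard deviations from this user's historical baseline."""
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        return today != mu
    return abs(today - mu) / sigma > threshold

# A user who normally downloads under 10 MB/day suddenly pulls a gigabyte:
history = [8.0, 9.5, 7.2, 8.8, 10.1, 7.9, 9.0]  # MB per day
is_anomalous(history, 1024.0)  # flagged
is_anomalous(history, 9.0)     # within normal range, not flagged
```

When responsibilities change, the old history no longer describes the user, which is why the baseline has to be rebuilt rather than treated as permanent.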
Even with those adjustments, a UBA system will issue some alerts when nothing devious is going on. Administrators will have to look briefly at each of them, perhaps ask the user some questions, and dismiss the report if there’s no further indication of trouble. If there’s a pattern of alerts, though, it’s time to look closer.
As part of a SOAR (security orchestration, automation, and response) system, UBA enables quick remediation. If the configuration allows it, SOAR can act directly on high-risk UBA alerts, quarantining the account or otherwise neutralizing the threat faster than an administrator could. In less urgent cases, access to the account could be temporarily restricted, with the user required to provide extra verification until the issue is cleared up.
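The tiered responses described above can be sketched as a simple decision function. The thresholds and action names here are assumptions for illustration, not a real SOAR product's API:

```python
# Illustrative sketch of tiered automated responses to UBA alert scores.
# Thresholds and action names are invented, not from any specific product.

def respond(alert_score: float) -> str:
    if alert_score >= 8.0:
        return "quarantine_account"   # urgent: contain immediately
    if alert_score >= 5.0:
        return "require_step_up_mfa"  # restrict until the user verifies identity
    if alert_score >= 2.0:
        return "notify_analyst"       # route to a human for review
    return "log_only"                 # record it and move on
```

The point of the tiers is that automation handles the urgent cases at machine speed while leaving ambiguous ones to human judgment.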
Insider threats are a serious concern
Verizon’s Insider Threat Report found that 57% of database breaches were at least partially due to insider threats. Most of them involved people who didn’t have high-level positions. With the pandemic, more people are working from home, making it easier to engage in subterfuge without being noticed. Home computers are generally less secure than in-office systems, so there’s a greater chance of accounts being compromised.
Insiders aren’t necessarily people inside the office. They could be anyone with access privileges, including contractors and business partners. The 2013 Target data breach was the result of poor security practices and excessively broad access granted to a business partner’s systems.
Contractors can have divided loyalties. They’re present just long enough to do a job, and they might find it useful to pull in some extra money by stealing data and selling it. Employees about to quit and work for a competitor sometimes grab information for their new employer. People who are fired or compelled to quit could be out for revenge, and they might have time to act before their accounts are shut down.
Spotting questionable activities from these people can make the difference between stopping them in time and having to repair the damage. Conventional threat detection tools will overlook actions that don't violate security restrictions. They recognize only actions that users shouldn't be able to take at the system security level, not ones that are outside the scope of their jobs. UBA can spot the latter before it's too late.
Insider threats include unintentional compromise as well as malicious action. The actor could be an outsider who stole an employee’s password. Whether the employee is disgruntled or careless, the effect is much the same. The important thing is to spot the problem quickly and shut it down.
UBA is becoming UEBA
It isn't just human users who engage in system activity and sometimes go astray. Malware, SQL injection, and other techniques can cause autonomous software to break from its normal patterns. A CMS, a background daemon, or an IoT device normally stays within predictable bounds. If it starts making requests like getting a dump of all users' login credentials, that's grounds for suspicion. A business partner with its own network counts as an entity if it has access to the company's network.
UBA has evolved into UEBA — user and entity behavior analytics. Gartner introduced the latter term in 2015. The two terms are often used interchangeably.
From the operating system’s viewpoint, there isn’t a lot of difference between a user and a process. A user or an entity runs with a certain set of privileges and interacts with applications and services. Tracking the behavior of all “user-like” processes is a logical extension of the old concept.
The shift in naming reflects an expansion of scope. Whether it’s called UBA or UEBA, it’s no longer just about “user behavior” in the narrow sense, such as data theft and sabotage. It’s about any activity which doesn’t seem right in its context, whether it belongs to a user or an automatic process. Malware can run under a logged-in user, a server, or anywhere else.
UBA doesn’t exclude SIEM
The centerpiece of modern network security is Security Information and Event Management (SIEM). It collects information from many sources and assembles a picture that includes any indications of hostile activity. It’s able to find threats that aren’t easy to spot by looking at just one system at a time.
Some people are confused about the relationship between UBA and SIEM. Neither one is an alternative to the other. They serve different roles. UBA looks at one user or process at a time for deviant behavior. SIEM takes a holistic approach, finding threats that may involve multiple applications, servers, and network components. The two work well together. SIEM is more outward-focused, UBA more inward-focused. In the future, we can expect to see more SIEM systems incorporate UBA/UEBA functionality.
UBA results can add information to SIEM reports. They point at specific accounts and can help to pinpoint the source of a threat that SIEM discovers. This allows faster incident responses, such as quarantining the implicated account.
Machine learning enhances User Behavior Analytics
A slight change in apparent user behavior can be cause for suspicion, but it would be impractical and unpleasant to investigate every little deviation from the baseline. A system that relies on static rules will miss real problems, flag too many false positives, or both. Machine learning (ML) allows more accurate identification of user issues.
ML doesn’t just collect a one-time baseline but keeps an ongoing record. Each user’s normal usage patterns are different, and they change over time. Analysts give the system feedback by marking alerts as real or not. Slow, gradual shifts in usage are less likely to indicate trouble than sudden jumps. Long-time employees have earned more trust than new ones, and ML takes that into account and adjusts as employees’ records accumulate months and years without a security incident.
Patterns indicating abnormal usage include behavior changes correlated with different access modes. If unusual actions consistently originate from a different IP address and outside normal working hours, ML will weigh them as more significant than a single odd action in the middle of a normal workday at the office. It can connect a series of unusual events, each one barely worth noticing in itself, into a timeline describing a questionable pattern of actions.
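The "connect a series of unusual events" idea can be sketched as a windowed correlation: flag a user when several minor anomalies cluster in time, even though each one alone is below the alert threshold. This is a minimal illustration with invented data, not an actual detection algorithm:

```python
from collections import defaultdict
from datetime import datetime, timedelta

def correlate(events, window=timedelta(days=7), min_events=3):
    """Flag users whose minor anomalies cluster within `window`,
    even if each anomaly is harmless on its own.
    `events` is a list of (user, timestamp) pairs."""
    by_user = defaultdict(list)
    for user, ts in events:
        by_user[user].append(ts)
    flagged = []
    for user, times in by_user.items():
        times.sort()
        # Slide over each run of `min_events` consecutive anomalies.
        for i in range(len(times) - min_events + 1):
            if times[i + min_events - 1] - times[i] <= window:
                flagged.append(user)
                break
    return flagged

events = [
    ("alice", datetime(2023, 5, 1)),
    ("alice", datetime(2023, 5, 3)),
    ("alice", datetime(2023, 5, 6)),   # three anomalies in six days
    ("bob",   datetime(2023, 5, 1)),
    ("bob",   datetime(2023, 6, 15)),  # two isolated anomalies
]
correlate(events)  # only alice's cluster is flagged
```

A real ML pipeline would weigh the events' content as well as their timing, but the timeline-building logic is the essential step.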
What does UBA detect?
The kinds of abnormal behavior that user behavior analytics finds aren’t a static list. Different vendors take different approaches and the state of the art changes over time. These are some of the issues which a UBA or UEBA system looks for:
- Privilege escalation. Users or malware may look for tricks letting them do things they aren't authorized to do. Software bugs sometimes mean the code doesn't properly check for authorization before taking a privileged action. A user may be able to impersonate a more privileged user by stealing an access token. Linux users may have sudo privileges even though they shouldn't; if that access is unrestricted, there are effectively no limits on what they can do on that system. The creation of an account with sweeping privileges should be double-checked as a general principle.
- Bulk data requests. Normal processes don’t usually ask for a huge number of records at once. There are certain exceptions, of course, depending on the software in use. In many cases, though, a wild-card request for records is a sign of attempted data theft. Too often it doesn’t take any ingenuity; employees commonly have unhindered access to sensitive data which they have no reason to touch.
- Abnormal data modification. What constitutes abnormal changes is a complicated issue. Large-scale bulk modification is usually suspicious. Modification of every file in a directory is highly suspicious and could indicate ransomware at work.
- High-volume or unusual downloading. If a download is from a user’s own directory, it probably indicates nothing more sinister than a backup. But if it’s from a source that holds sensitive information, it could be data theft in progress.
- Repeated attempts at prohibited actions. This could include accessing directories that the user isn't allowed to touch, or unauthorized database operations such as deleting tables. On a small scale, it most likely signifies user error or curiosity, but if it becomes a pattern, malicious action is possible.
- Installing unauthorized software. Whether this is an issue depends on company policies. If there are restrictions on what employees can install, an attempt to download and install applications not on the approved list could indicate bad judgment, at a minimum.
- Data transfer to an outside channel. There are many cases where this is legitimate and normal. If the transfer is to an IP address that normally isn’t used, it can mean trouble. The amount and nature of the data being moved are factors in deciding if there’s a serious problem.
- Modification of vendor-supplied executable files. If WordPress core files are changed, other than when performing an installation or upgrade, that’s almost always a cause for concern. There’s a good chance that malware has infected the user’s account.
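Several of the categories above reduce to checks on an event's attributes. This toy sketch shows the shape of such rules; every field name and threshold is invented for illustration, and real products derive these limits from learned baselines rather than constants:

```python
# Toy rule checks mirroring the detection categories above.
# Field names and thresholds are invented assumptions.

def detect(event: dict) -> list:
    """Return the names of the abnormal-behavior categories
    that this event matches."""
    findings = []
    if event.get("records_requested", 0) > 100_000:
        findings.append("bulk_data_request")
    if event.get("files_modified", 0) > 500 and event.get("same_directory"):
        findings.append("possible_ransomware")
    if event.get("download_mb", 0) > 1000 and event.get("source") == "sensitive_store":
        findings.append("high_volume_download")
    if event.get("denied_attempts", 0) >= 5:
        findings.append("repeated_prohibited_actions")
    if event.get("core_file_modified") and not event.get("upgrade_in_progress"):
        findings.append("vendor_file_tampering")
    return findings

# A wildcard dump attempt plus a string of denied requests:
detect({"records_requested": 500_000, "denied_attempts": 6})
```

In practice the thresholds are per-user and learned, which is exactly what separates UBA from static rule engines.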
User behavior analytics takes the user’s role into account. Administrators do things that normal users have no reason to do. Developers write code that does all kinds of strange things when they are testing it, but it shouldn’t do them to production systems. At the same time, administrators and developers are in a position to do more harm than the average user, so anything that triggers an alert on their accounts should be taken seriously.
Each reported anomaly gets a score. Ones with a very low score usually need just a quick look. Unless other factors call for suspicion, there’s probably no need to do anything. Machine learning systems will track users who generate a lot of low-score alerts and increase the score of later incidents involving them.
Some actions are routine but increase the score of anomalies. For instance, if a user changes to a new phone number for MFA, that’s not suspicious by itself, but if it’s immediately followed by logins from a distant country, it makes those logins more suspicious.
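The MFA example above amounts to a score modifier: the routine action isn't an alert itself, but it amplifies the risk of what follows. A minimal sketch, with invented weights:

```python
# Hypothetical score adjustment: a routine action (changing the MFA
# phone number) amplifies the risk of a subsequent anomaly.
# Weights are invented for illustration.

def adjusted_score(base: float, recent_mfa_change: bool,
                   foreign_login: bool) -> float:
    score = base
    if foreign_login:
        score += 2.0        # logins from a distant country add risk
    if recent_mfa_change and foreign_login:
        score *= 1.5        # the combination is worse than either alone
    return score
```

Here the MFA change alone leaves the score untouched; only the combination raises it.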
User Behavior Analytics requires good judgment
People do unusual things for legitimate reasons. Their job responsibilities may have changed, they may have a special task to finish, or they could be using new software. Machine learning techniques keep the false positive rate down, but even in the best case, there are alerts that don’t indicate problems.
A security operations center, or SOC, employs trained analysts who can sort through the reports they get and decide which ones need attention. Sometimes no action is needed. Sometimes an appropriate initial response is to note the account and keep an eye on it. Asking the flagged user a few questions may help, but people will get annoyed if they’re called too often for what seems like no good reason.
Sometimes, though, prompt action is essential. Disabling a suspicious account can stop misbehavior before it inflicts serious harm. This allows time to examine the incident and decide whether it’s malware, a hijacked account, employee malfeasance, or something else.
It’s a fine balance. A single deviation rarely triggers an alert, but hostile insiders try to stay under the radar. A pattern of low-level activity over an extended period may be the key to discovering illicit activity. The SOC can’t be too trigger-happy, but it has to investigate everything that could be a serious incident. Experience and intuition have to complement UBA software to handle incidents smoothly.
What should you look for in a UBA system?
A good UBA system uses state-of-the-art machine learning. This lets it adjust to legitimate variations in user behavior and avoid issuing too many meaningless alerts. It takes multiple factors into account when calculating the risk score of an event.
SIEM integration is a major plus. The two kinds of threat investigation work best when they’re combined.
UBA needs to comply with applicable privacy requirements. It shouldn't let administrators snoop on personal email or browser usage, as long as that usage doesn't violate any policies. Information presented to analysts and administrators should not disclose personal information except when it's necessary for investigating anomalies. The service contract has to guarantee that the provider won't share the customer's data with third parties or otherwise misuse it.
The coverage needs to be as broad as possible. Records of user activity are spread out over multiple logs belonging to different applications and types of access. The more logs UBA takes into account, the more complete its picture will be.
A UBA service needs the backing of expert analysts. Even with a design that limits false positives, analysts have to examine each plausible report and decide whether to follow it up. Amateurs will burn out on meaningless alerts and start to ignore the significant ones.
Businesses have to look out for threats from both the outside and the inside. UBA is an important component of security practices to detect hostile employee action, compromised accounts, and malware. BitLyft uses UBA in conjunction with SIEM as a service and an expert security operations center to give you the highest level of protection against threats to your data and systems. Contact us to talk to an expert and see a demonstration of BitLyft’s system in action.