Complexity in the Stack Is Slowing Down Decisions
By Cyrille Badeau, Vice President of EMEA
The Stack Expanded but Decision Speed Did Not
Security environments did not become complex by design. They evolved incrementally. Each tool addressed a gap in detection, visibility, or response.
Over time, the architecture expanded, but the system was never designed to operate as a single decision layer.
Data moves between systems, but context does not consistently follow. Alerts surface without full entity history. Intelligence exists, but it is not always applied at the point where decisions are made. Relationships between users, systems, and events are distributed across tools. The result is a consistent pattern across environments: detection has improved, but decision speed has not.
At enterprise scale, that gap defines performance.
The Investigation Tax
A large portion of investigative effort sits outside defined workflows. It exists in the space between systems.
Analysts move between identity data, endpoint telemetry, ticketing systems, and intelligence platforms. Signals are revalidated. Entity relationships are reconstructed. Context is rebuilt step by step before any decision can be made. The result is what can be described as an investigation tax: time and effort spent assembling information rather than evaluating it.
In most environments, a single incident involves multiple pivots across systems. Each transition increases latency and reduces consistency.
From a UEBA perspective, risk is calculated at the entity level across time and behavior. When the underlying data is fragmented, that risk becomes harder to interpret and act on. From a CTI perspective, intelligence is available, but it is not consistently applied within the same analytical context, limiting its contribution to prioritization.
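As a rough illustration of entity-level risk, the sketch below folds behavioral anomaly scores and an intelligence match into one score per entity. The data model, weights, and field names here are invented for illustration; they do not represent any particular product's scoring model.

```python
from dataclasses import dataclass, field

@dataclass
class EntityRisk:
    """Toy entity-level risk record: behavior plus intelligence in one place."""
    entity_id: str
    # Each event: (behavioral anomaly score 0-100, matched threat intel?)
    events: list = field(default_factory=list)

    def score(self) -> float:
        """Combine average behavioral deviation with an intelligence
        boost into a single entity-level risk value, capped at 100."""
        if not self.events:
            return 0.0
        behavioral = sum(a for a, _ in self.events) / len(self.events)
        intel_boost = 25.0 if any(hit for _, hit in self.events) else 0.0
        return min(100.0, behavioral + intel_boost)

user = EntityRisk("jdoe")
user.events = [(30.0, False), (55.0, True), (40.0, False)]
print(round(user.score(), 1))  # -> 66.7 (behavioral mean ~41.7 + intel boost 25)
```

The point of the structure, not the arithmetic, is what matters: when behavior and intelligence live on the same record, prioritization needs no cross-system reconstruction.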
The system contains the data required to make a decision. It does not present it in a form that supports one.
More Data, Same Constraint
Expanding the stack increases coverage and data volume. It does not resolve the underlying constraint. Information is distributed across systems with different schemas, enrichment levels, and timelines. Data normalization varies. Correlation requires manual effort or multiple queries. Analysts have access to more signals, but not to a unified view of risk.
Detection quality depends on context. Behavioral analytics requires consistent data to establish baselines and detect meaningful deviation. Threat intelligence contributes when it is structured, scored, and applied alongside entity behavior. When these elements exist in separate layers, their impact is reduced.
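A minimal sketch of what "consistent data to establish baselines" means in practice, assuming a simple per-entity history and a z-score deviation check (a real system would use far richer features, windows, and thresholds):

```python
import statistics

def deviation(history: list[float], observed: float) -> float:
    """How many standard deviations the observed value sits
    from the entity's historical baseline."""
    mean = statistics.fmean(history)
    stdev = statistics.pstdev(history) or 1.0  # guard against zero variance
    return (observed - mean) / stdev

# Hypothetical baseline: logins per hour for one user
logins_per_hour = [2, 3, 2, 4, 3, 2, 3]

print(deviation(logins_per_hour, 3))   # near baseline -> small deviation
print(deviation(logins_per_hour, 40))  # far from baseline -> worth flagging
```

If the history is split across tools with different schemas, this baseline cannot be computed reliably, which is exactly the constraint the paragraph describes.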
The constraint remains unchanged. Moving from data to a clear, defensible decision requires time, effort, and reconstruction.
Context as a System Property
In environments where context is preserved across the workflow, investigations begin with a different foundation.
Entity behavior, historical patterns, and intelligence are already connected. Risk is calculated before the analyst engages. Relationships between users, devices, and sessions are visible without additional correlation.
An analyst reviewing a credential misuse case, for example, does not need to assemble identity history, access patterns, and intelligence indicators from multiple systems. That context is already aligned. Work shifts toward validation and decision-making. Fewer transitions are required. Consistency improves because the same context is available across teams and locations.
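The "context already aligned" idea can be sketched as a pre-join across sources before the analyst opens the case. All source names, entities, and fields below are hypothetical:

```python
# Toy records from three separate systems, keyed by entity or artifact
identity = {"jdoe": {"department": "finance", "recent_geo": ["FR", "FR", "RO"]}}
endpoint = {"jdoe": {"host": "lt-0142", "new_process": "rclone.exe"}}
intel = {"rclone.exe": {"tag": "exfiltration-tool", "score": 80}}

def enrich(entity_id: str) -> dict:
    """Build one pre-joined case record so the investigation starts
    with assembled context instead of cross-system pivots."""
    ep = endpoint.get(entity_id, {})
    return {
        "entity": entity_id,
        "identity": identity.get(entity_id, {}),
        "endpoint": ep,
        # Intelligence applied in context: keyed off the observed artifact
        "intel": intel.get(ep.get("new_process", ""), {}),
    }

case = enrich("jdoe")
print(case["intel"]["tag"])  # -> exfiltration-tool
```

The enrichment itself is trivial; the architectural point is that it happens once, upstream, rather than manually in every investigation.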
Performance is tied to model quality and data integrity rather than manual effort.
Evaluate Architecture by Decision Performance
Security architecture is often evaluated based on coverage and capability. A more relevant measure is how the system performs during an investigation.
If architecture requires repeated correlation across tools, it introduces latency into every workflow. If context must be rebuilt manually, it limits scale and increases variability. If intelligence and behavioral signals are not aligned, confidence in prioritization decreases.
The system should be evaluated by how efficiently it supports decisions. How quickly context is available. How consistently risk is calculated. How reliably outcomes can be reproduced. Reducing complexity depends on how well the environment operates as a unified data and decision layer.
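Decision performance becomes measurable once case timestamps are recorded. The sketch below, with invented sample data and field names, computes two of the measures above: how quickly context is available and how quickly a decision follows.

```python
from statistics import fmean

# Hypothetical case log: minutes from case open to each milestone
cases = [
    {"opened": 0, "context_ready": 25, "decided": 40},
    {"opened": 0, "context_ready": 50, "decided": 70},
    {"opened": 0, "context_ready": 10, "decided": 22},
]

# Mean time until full context is in front of the analyst
time_to_context = fmean(c["context_ready"] - c["opened"] for c in cases)
# Mean time until a decision is reached
time_to_decision = fmean(c["decided"] - c["opened"] for c in cases)

print(round(time_to_context, 1), round(time_to_decision, 1))
```

Tracked over time, the gap between the two numbers separates assembly effort from evaluation effort, which is the distinction the article draws.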
The Securonix Perspective
Detection, investigation, and response cannot stay separated if teams are expected to move at pace. Most teams already have the data they need. The difficulty sits in connecting it. Analysts move between systems, check multiple sources, and rebuild the same context repeatedly before deciding. That friction adds up quickly across larger environments.
Bringing behavioral analytics, threat intelligence, and telemetry into the same working layer removes a large part of that effort. Risk is calculated at the entity level ahead of time. Intelligence is applied in context. The analyst is working with a complete view rather than assembling one.
The workflow becomes more direct. Less time is spent gathering information. More time is spent evaluating it. Fewer handoffs. Fewer repeated steps.
Systems need to carry context from the first signal through to the final decision without breaking it apart. Teams that operate this way tend to move faster and stay consistent, even as their environments scale.