Adopting a red team mindset to build effective incident response

Published October 15th 2025

The Danger of Complacency

Throughout history, misinformation and the absence of effective challenge in group settings have led to dangerously negative outcomes.

The Challenger and Columbia shuttle disasters: safety concerns raised by engineers were minimised or rationalised away by leadership teams under pressure to maintain launch schedules.

The COVID-19 pandemic: early misinterpretation of data and a failure to challenge optimistic modelling delayed responses in several nations.

Weapons of Mass Destruction and the invasion of Iraq: the Chilcot Inquiry concluded that the UK went to war in Iraq based on flawed intelligence assessments about Iraq possessing weapons of mass destruction. These assessments were accepted and acted upon without sufficient challenge, despite significant uncertainty in the underlying evidence. The intelligence regarding weapons of mass destruction was, according to Chilcot, ‘stated with misleading and unjustified certainty.’

In moments of crisis or complexity, consensus can feel like comfort, but comfort is rarely a sign that all is well. For leaders and organisations, blind spots form not from ignorance but from untested assumptions. Adopting a red team mindset may be the antidote: a way of thinking that invites dissent, tests assumptions, and acts before failure strikes.

Crisis events, particularly when novel, often require innovative solutions. Creative thinking within a team means thinking more broadly and pivoting among different ideas to produce divergent solutions.

What Is Red Teaming?

According to the UK Ministry of Defence’s Red Teaming Guide, red teaming is ‘the independent application of structured, creative and critical thinking to challenge plans, programmes, ideas and assumptions.’ It’s a systematic way of asking difficult questions before reality does.

Originally developed in military and intelligence settings, red teaming is now widely applied in business strategy, emergency planning, cybersecurity, and government decision-making. The concept is simple but powerful: create a team, separate from the planners or decision makers, whose role is to test, probe, and stress-test plans as if they were an adversary or sceptical observer.

Formal red teams can be resource intensive and, if overly bureaucratic, may lack the agility needed for sudden-onset incidents. Adopting a red team mindset (structured scepticism, assumption testing, and rapid ‘what if’ challenge) within existing crisis teams delivers much of the same value while remaining nimble enough for urgent, complex problems.

A red team mindset is not there to ‘win’ or humiliate but to make organisational thinking more robust. Done well, it fosters humility in leadership, reduces groupthink, and builds confidence that strategies have been tested under pressure.

The red team’s job is to challenge, not undermine. Psychological safety is vital, and leaders must explicitly welcome critique.

Why We Need Red Teaming

The need for red teaming stems from a fundamental truth about human decision-making: we are not rational creatures. Even the best leaders fall prey to cognitive bias and misinformation. When operating under time pressure, uncertainty, or emotion (all hallmarks of a crisis), these flaws are amplified.

Red teaming provides the structure and discipline to counter these tendencies. It offers a deliberate, safe mechanism for dissent, ensuring plans are tested from every angle before being acted upon.

It forces decision-makers to ask:

  • What if our assumptions are wrong?

  • How might this plan fail?

  • What could an intelligent opponent exploit?

  • What are we not seeing?

The Triple Threat: Groupthink, Misinformation, and Cognitive Bias

To appreciate red teaming’s value, it helps to understand the enemies it seeks to defeat.

1. Groupthink

Groupthink occurs when teams prioritise harmony and consensus over critical analysis. It thrives in environments where dissent feels risky or where authority discourages challenge. The result is collective blindness: a team that appears unified but is actually uncritical. Groupthink is most common in cohesive groups facing serious external threats. Red teaming attacks groupthink by legitimising dissent.

During the Cuban Missile Crisis (1962), President John F. Kennedy deliberately encouraged dissent within his advisory team, forming what we might now call a red team to challenge the prevailing military advice for an airstrike on Cuba. By fostering open debate and questioning assumptions, Kennedy’s team avoided groupthink, steering toward a diplomatic resolution that averted nuclear war, a defining example of how critical challenge can prevent catastrophic escalation.

2. Misinformation

In an era of rage farming, deepfakes, and rapid information cycles, misinformation spreads faster than verification. But misinformation isn’t just ‘fake news.’ Inside organisations, it can take subtler forms: optimistic reporting, selective data use, or unverified assumptions passed off as fact.

Red teams counter misinformation by verifying assumptions, tracing information to its source, and deliberately testing what would happen if certain data were wrong.

3. Cognitive Bias

Cognitive biases are mental shortcuts the brain uses to process information quickly, often at the expense of objectivity. They are deviations from rational thinking and can be disastrous in complex decision environments. Confirmation bias, anchoring bias, overconfidence, and sunk cost fallacy are just a few examples.

Adopting a red team mindset helps identify and disrupt these patterns. Red teams might require decision-makers to defend the opposite position, test counterfactuals, or re-evaluate evidence from a blank slate. In doing so, they help organisations replace certainty with curiosity.

How Red Teaming Counters These Threats

1. Structured Dissent

Red teaming makes dissent a structured part of decision-making. By institutionalising challenge, leaders can hear uncomfortable truths early, before they become operational failures.

Techniques such as Devil’s Advocacy, Assumption Surfacing, Alternative Futures Analysis, and the Cone of Plausibility allow teams to articulate different perspectives and expose blind spots.

One of the key mantras of a red team mindset is that everyone speaks once before anyone speaks twice, and leaders speak last. This ensures each team member contributes, promoting genuine engagement and enabling constructive challenge early.

2. Information Stress Testing

Red teams examine the validity and origin of information. In misinformation-rich environments, they test how plans would perform if key assumptions were false or data compromised, as the examples and sketch below illustrate.

  • In cybersecurity, this might mean simulating data corruption or phishing attacks.

  • In crisis communications, it could involve injecting false narratives to test the organisation’s ability to identify and correct misinformation under pressure.
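
To make this concrete, here is a minimal sketch in Python of what stress testing against corrupted data might look like. The incident report fields and the validate_report function are invented for illustration; the point is the habit of deliberately breaking each input and checking whether existing validation actually notices.

```python
import copy

# Hypothetical incident report a crisis team might rely on.
baseline_report = {
    "source": "sensor-feed-7",
    "severity": 3,          # expected range: 1-5
    "casualty_count": 0,    # expected: non-negative integer
}

def validate_report(report: dict) -> list[str]:
    """Return a list of problems found in an incident report (illustrative checks only)."""
    problems = []
    severity = report.get("severity")
    if not isinstance(severity, int) or not 1 <= severity <= 5:
        problems.append("severity out of range")
    count = report.get("casualty_count")
    if not isinstance(count, int) or count < 0:
        problems.append("casualty_count invalid")
    if report.get("source") is None:
        problems.append("source missing")
    return problems

# Red-team style stress test: corrupt each field in turn and
# report whether the validation logic catches the problem.
corruptions = {
    "severity": 99,          # out-of-range value
    "casualty_count": -1,    # impossible negative count
    "source": None,          # provenance stripped away
}

for field, bad_value in corruptions.items():
    corrupted = copy.deepcopy(baseline_report)
    corrupted[field] = bad_value
    problems = validate_report(corrupted)
    status = "CAUGHT" if problems else "MISSED  <- blind spot"
    print(f"corrupting {field!r}: {status} {problems}")
```

If any corruption is ‘MISSED’, the red team has found a blind spot worth fixing before an adversary, or a faulty data feed, finds it first.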

3. Bias Awareness

Through structured exercises and post-mortems, red teams reveal where decision-makers fall prey to bias. A well-designed red team doesn’t just critique the plan; it holds a mirror to the thinking behind it.

From Insight to Action: Building a Red Team Culture

A red team isn’t a one-off exercise; it’s a mindset. To embed it:

1. Train for Cognitive Diversity
Bring together people who think differently. Red teams thrive on contrast: technical, operational, and cultural diversity all sharpen analysis.

2. Reward Candour
Celebrate those who spot flaws, not just those who deliver good news. Make curiosity and constructive dissent core organisational values.

3. Institutionalise Challenge
Embed red team thinking into crisis management exercises and post-incident reviews. Make it part of business as usual, not a special event.

4. Debrief and Share
Use lessons learned to update procedures and training. Create a feedback loop between ‘blue teams’ (planners) and ‘red teams’ (challengers).

5. Protect the Process
Red teaming only works if leadership resists the urge to ‘shoot the messenger.’ Psychological safety and leadership humility are its foundations.

Conclusion

The challenges we face in incident response today are increasingly complex and multi-faceted. Red teaming, and the mindset that underpins it, is a powerful tool for improving decision quality. It strengthens plans by challenging assumptions, identifying blind spots, and encouraging leaders to think critically under pressure.

At its core, red teaming acknowledges the frailties of human cognition and provides a structured, team-based mechanism to overcome them. By doing so, it transforms uncertainty into insight and makes crisis response not just reactive, but resilient.

If you’d like to know more about red teaming, its associated methodologies, and how your organisation can better prepare for crisis events, contact me at derek@prosilienceconsulting.eu or visit www.prosilienceconsulting.eu.
