
The Bannachra Blog


Corporate Sustainability and Natural Capital

It can be challenging for an individual to meaningfully comprehend the vastness of the Earth’s natural environment. Humanity, as a species of nearly 8 billion individuals, accounts for only 0.01% of the total biomass of life on the planet, where biomass is a measure of how much carbon is held within each living organism (Ritchie, 2019). Approximately 17% of all biomass produced in 2013 was consumed by humans, compared with roughly 4% of the biomass produced in 1900 (Gates, 2013).

Each person and corporation impacts and depends on the environment to a greater or lesser extent. 

Corporations are responsible for producing and supplying the majority of humanity’s food, transport, energy, and commercial products. These activities draw on the natural world in ways that scientists have determined to be fundamentally unsustainable (United Nations Environment Programme [UNEP], 2019:4).

At some point in the not-too-distant future, the natural world will be unable to continue to supply the resources necessary for corporate activity as it functions today. When this happens, because of corporate dependence on these resources, corporations will struggle to continue to operate and the natural world will struggle to replenish itself independently. It is therefore in the best interests of corporations to reassess and adjust how they engage with the natural world.

Corporations should assess their impact and dependence on the natural world by examining the activities in their value chains. Doing so helps them understand the materiality of these issues and their responsibility regarding the use of natural resources. Once a corporation has that understanding, what should it do with the knowledge?

Two main strategies present themselves:

Mitigate impact on natural resources.
Adapt to become less dependent on natural resources.


A (Short) History of Resilience

Resilience is a concept that has evolved over the last 100 years. During this time, the UK has gradually moved from civil defence (a focus on war risks, with some spillover benefits) to Integrated Emergency Management (IEM, an all-hazards approach driven by risk assessment) to Resilience (preparedness for effects across networked systems). In each case, embedding change has taken years.

Modern emergency powers have their roots in the post-WWI desire to be able to tackle any threat to the state and to recognise broader civil contingency risks beyond war. In the period running up to WWII, this early concept of emergency planning remained focussed on security, but it did include consideration of critical supply chains and risks to national infrastructure. The expansion of this approach to include civil contingency risks, and to give local responders official responsibilities, came just before the outbreak of WWII and naturally focussed on protecting local communities from the impacts of war.

After the war, civil defence continued to develop to include smaller-scale civil crises, including the widespread disruption caused by strikes in the 1970s. But as the risks facing the UK evolved in the post-war period, so did our approach to tackling them. It would not be until the 1980s that the new concept of IEM emerged, taking a broader risk-based approach to the whole range of hazards facing the UK as a nation. This was adapted further in the early 2000s into a new Resilience approach, partly driven by the 9/11 attacks.

Now, IEM and Resilience are systems employed across the world. IEM forms the basis of emergency management work in most developed countries. The UK Government was an outlier when it adopted Resilience in the 2000s, but it is now common practice internationally. The key deficiency with each of these approaches has been the inability to get ahead of problems: to tackle them at source.
