Regulatory

Change on the Horizon?: The Economic Crime and Corporate Transparency Act 2023 and Failure to Prevent Fraud Offences

The Economic Crime and Corporate Transparency Act 2023 (ECCTA) received Royal Assent on 26 October 2023 in the face of a UK “fraud epidemic”. As described under question 9.3 of the ICLG Corporate Investigations 2024 – England and Wales chapter, two key reforms in the ECCTA to address this epidemic were the introduction of a “failure to prevent” (FTP) fraud offence, and reform of the “identification doctrine” for fraud and other economic crimes.

Broadly, the FTP fraud offence will make “large organisations” strictly liable if an “associated person” commits a fraud offence intending to benefit the organisation, unless the organisation can prove that it had “reasonable procedures” designed to prevent the offending.

The reform of the identification doctrine significantly expands the category of persons who could be “identified with” a commercial organisation for the purposes of attributing criminal liability in economic crimes from “directing minds” (usually Board directors) to “senior managers” (so broadly defined as potentially to include department heads, for example).

BCL’s Tom McNeill, Richard Reichman, Michael Drury, and John Binns give their expert opinion on the reforms in ICLG.

This article was first published in ICLG on 15 May 2024.

Related articles

We Need to Talk About Corporate Criminal Liability

The fines of £10m for Transport for London and £4m for Tram Operations Ltd following the Croydon tram tragedy signal a wrong turn in health and safety sentencing, and corporate criminal liability generally

On 9 November 2016, seven passengers were tragically killed and 19 suffered life-changing injuries at Sandilands junction near Croydon when a commuter tram derailed. The tram was being driven at 73kph into a sharp turn, where the speed limit was 20kph and where any speed over about 50kph would inevitably cause overturning. The tram driver, Alfred Dorris, told police that he was disorientated and thought he was heading in the opposite direction.

The safety regulator, the Office of Rail and Road, prosecuted Mr Dorris under health and safety law for failing to take ‘reasonable care’. There was a lengthy trial at the Old Bailey and, in June 2023, Mr Dorris was acquitted by the jury. Experts agreed that ‘driver disorientation’ was the most likely cause and that ‘assorted defects and issues within the tunnel including the defective lighting contributed to the driver’s disorientation whilst he was driving the tram.’

After the verdict, some family members of the victims were understandably upset. One commented: ‘If I got into my car and I did what he did at the speed that he did, then I would go to prison’. Fortunately, the organisations responsible for the infrastructure of the tram network (Transport for London) and for operating the trams (Tram Operations Ltd) pleaded guilty to health and safety offences and were to be sentenced the following month. There was a chance for justice yet.

Sentencing

The sentencing judge, then Mr Justice Fraser, now The Honourable Lord Justice Fraser, had been Mr Dorris’ trial judge. The above-quoted comments about ‘driver disorientation’ and the contribution of ‘assorted defects and issues within the tunnel’ are in fact taken from Fraser’s sentencing remarks. Fraser was scathing about the ‘complacency’ which allowed such issues to persist, and the missed opportunities to put things right.

Referring to Health and Safety Executive guidance in relation to ‘human error’ and ‘managing risk’, Fraser explained that, although the immediate cause of this disaster was the human failure by Mr Dorris, the underlying causes were the failures of control by TfL and TOL, where ‘Prime responsibility for accident and ill health prevention rests with management’.

Fraser found both organisations ‘highly culpable’, where breaches ‘subsisted over a long period of time’ and where there was a ‘high likelihood of harm’. Fraser also found significant ‘aggravating features’. It was the ‘unacceptably complacent approach to safety generally’ which so offended Fraser, in the face of what he considered an obvious risk of death. That is why he imposed such enormous fines. Not merely to punish (of course, in a fair and proportionate way) those highly culpable organisations, but to deter future breaches, not just by these organisations but other potential offenders who might otherwise not heed the lessons.

The Issue

While Fraser’s approach might seem perfectly sensible, and indeed admirable, unfortunately, when you get into the weeds, it is not. The issue however is not that Fraser was unreasonable (unreasonable judges are commonplace), but that there’s little that the organisations concerned could do about it. There are a few reasons for this (explained briefly below) and a few possible options for organisations keen to avoid a similar fate (again, explained briefly), but at its heart this is a structural issue that not only repeats time and again in health and safety cases but applies more widely to ‘regulatory’ offences of all kinds.

And while the UK rushes to extend corporate criminal liability ever further, in particular by expanding the ‘regulatory’ approach to new spheres, including through the ‘failure to prevent’ offences for economic crimes – and adopts an approach to sentencing corporates that makes little distinction between strict liability offences and offences that require proof of failures by ‘senior management’ – the Croydon derailment warns of what lies ahead: liability for organisations but rarely for individuals; punishments for organisations disproportionate to (senior management) culpability; ‘deterrent’ sentences that are positively Kafkaesque. And if the unfairness, or unnecessary financial burdens on UK businesses, are not sufficiently moving, such an approach may in fact do more harm than good.

Fraser’s Judgment

Prior to Croydon, the last time a tram had overturned in the UK was 1953. There had only been a few cases of trams overturning in the rest of the world. Following the accident, the Rail Accident Investigation Branch (RAIB), the specialist government agency responsible for independently investigating serious rail accidents to identify causes and make recommendations to improve railway safety, found that the risk of trams overturning due to excessive speed around curves had not been addressed by UK tramway designers, owners, operators, or the safety regulator.

The entire industry, in fact, including (it is worth repeating) the safety regulator, discounted the risk. Many didn’t spot the risk of trams overturning at all. Fraser thought the risk ‘obvious’. It certainly is obvious if the first time you paid any attention to a tram was when it derailed at high speed and killed seven passengers. It may be less obvious to someone who has spent several decades monitoring trams dawdling around corners at 20kph. And that was the essential context for everything. For the lack of understanding, for the failures in systems, for the not doing what Fraser thought should be done.

At the time of the accident, the safety of trams ultimately came down to the drivers. There were various measures in place to ensure the competence of drivers, which were not criticised. Tram drivers were (and still are) expected to drive at a speed which will enable them to stop the tram in the distance that they can see ahead, like the drivers of road vehicles. Driving a tram is relatively safe compared to driving a road vehicle and there were relatively few accidents.

What had not been adequately understood was the risk of driver ‘disorientation’. In particular, driver disorientation occurring during a stretch of the track where the tram reaches high speeds before going into a sharp turn, with the consequent risk of overturning. However unlikely such an event might be, with sufficient tram journeys over sufficient years, eventually, if not spotted in the meantime, that risk is going to result in an accident.

Fraser rightly pointed to HSE guidance about human error and managing risk. Organisations have extremely onerous duties to ensure safety, including developing systems that recognise and control for human error, so far as is reasonably practicable.

After the accident, the RAIB’s principal recommendation to prevent future accidents was to introduce engineering controls that automatically enforce compliance with signals and obedience to speed limits. Even if a driver became disoriented at just the wrong moment, automatic systems would slow the tram down. However, as no one else in the UK had introduced these engineering controls prior to Croydon – because a tram hadn’t overturned in the UK since 1953 and the entire industry, including the safety regulator, didn’t think such controls necessary – even Fraser accepted that the organisations could not be criticised for their absence.

This is crucial not only because the organisations could not be blamed for not implementing the essential control, but also because the same industry-wide understanding of risk was relevant to other controls which Fraser found the organisations ‘highly culpable’ for not implementing.

Take the example of signage (Fraser’s best point). Experts at trial said that if there had been more and larger signs, the driver would possibly not have become disoriented or would have reoriented himself more quickly. In 2007, a former employee, Mr Snowdon, noting the risk of disorientation, had recommended extra signage and other measures to improve visual cues within the Sandilands tunnel. Elsewhere in the UK, there were a few occasions when an organisation had put in place this kind of additional signage for a high-risk spot. In short, at the time of the accident, the signage at Sandilands could and should have been better.

When assessing culpability, however, it is important to note that the signage complied with the relevant guidance issued by the safety regulator (which also received Mr Snowdon’s warning). And that the guidance did not recommend more because the safety regulator and the industry generally did not appreciate the risk. Mr Snowdon was an outlier. He recommended safety measures beyond what most others thought necessary, and even Mr Snowdon appears not to have identified the risk of derailment. The tragedy proved Mr Snowdon right and almost everybody else wrong. But how culpable was such an error? How much of that culpability should be laid at the doors of the organisations concerned? And what is the deterrence value of punishing it?

Fraser made much of the fact that there had been previous occasions when a driver had gone too fast into the Sandilands curve and needed to brake heavily, and the failures to report and act upon such ‘near misses’: ‘TOL was responsible for managing the drivers, and had a system where they were expected to self-report incidents. Many were reluctant to do so due to the adverse consequences, including disciplinary proceedings, that sometimes followed if they did. The system of self-reporting was not a safe one, and the regulator had suggested some time before the disaster that it be changed.’

In fact, the regulator (yes, the ORR), in the 2010 audit referenced by Fraser, found no evidence that the system of reporting was below the required standards; and while the ORR identified areas to be reviewed and addressed (which areas were reviewed and addressed), the audit also included: ‘Senior management is highly positive about the importance of safety and is active in promoting and including staff in discussions. There is an open culture around safety at the management level. This is recognised and appreciated by staff. // Staff feel that they are encouraged by the company to report issues and that senior management are very open to hearing concerns.’ These findings were confirmed by an independent survey commissioned by TOL in 2016.

What more should TOL have done? Fraser made reference to drivers’ fears of ‘adverse consequences’ should they have self-reported (including a fear of ‘excessive monitoring’), and referred to HSE guidance that includes: ‘Organisations must recognise that they need to consider human factors as a distinct element which must be recognised, assessed and managed effectively in order to control risks’. Fraser however is wading into a highly contested debate amongst health and safety experts about ‘just culture’ – highly contested because there are no easy solutions. Not only in terms of persuading workers to self-report (there’s little evidence of increased reporting), but the adverse consequences of adopting the kind of extreme ‘no blame’ culture that Fraser impliedly advocates. The problem being that it removes personal responsibility from workers and, as many health and safety experts (albeit, not Honourable Lord Justices) point out, has significant downsides when considering the management of safety risks as a whole.

Ultimately, these failures to report by experienced drivers were not ‘human errors’ but ‘deliberate violations’ and clear breaches of the health and safety duties imposed by law on those drivers. The law was framed as it is precisely because those responsible for its drafting understood the importance of individual responsibility in ensuring safety.

And this is where we arrive at the nub. Recognising that organisations have onerous responsibilities to guard against human failure does not mean that a failure to succeed is necessarily culpable, still less ‘highly culpable’. This is because despite all the excellent (albeit contradictory) guidance issued by health and safety experts, no one has yet cured human fallibility. Not least because that same capacity for human error, failures of foresight, and so on, applies to ‘management’ as it does to frontline workers.

Take Fraser’s description of the failure to perform a ‘route risk assessment’ in addition to a ‘route hazard assessment’: ‘This saga, which amounts to the story of almost endless recitation in successive meeting minutes of steps being taken such as videoing the route, the work not being completed, updating the videos, and carrying the item forward to the next meeting, reads as a sorry tale of lack of meaningful progress, with the entry being ultimately marked “closed” without any risk assessment ever being done.’ Fraser’s frustration is easily understood. Less well understood is what organisations can do to prevent such failings entirely.

Fraser accepted that there were effective procedures in place for other aspects of risk: ‘TFL did try to be careful about safety and I accept that from 2008, when it acquired its interest in the network, it worked far more co-operatively and constructively with TOL than its predecessor had. It has no previous convictions and has a good record. There were effective procedures in place for other aspects of risk…’. And: ‘TOL did have committees and structures dealing with safety and I accept that it has no previous convictions and has a good record. There were effective procedures in place for other aspects of risk and an external consultant, QSS, was used to advise it on such matters…’.

In other words, there is no evidence that the organisations were not trying to ensure effective procedures across the business; indeed, it was accepted that they largely succeeded. In these circumstances, and recognising the common failings to address the risk across the industry, what was the purpose of such draconian fines? What was the deterrence value? What are the lessons to be learned? Not the lessons learned with hindsight following the accident which were addressed by the RAIB and the organisations themselves prior to the sentencing hearing, but the lessons that the ‘management’ should have known before the accident? Put another way, what did the sentencing achieve?

A Wrong Turn?

Health and safety law in the UK is founded on the Health and Safety at Work etc. Act 1974, which sought to implement the findings of the Robens Report, the culmination of an extensive review by the Committee on Health and Safety at Work. On its fiftieth anniversary, in July 2022, the principal health and safety regulator, the HSE, stated that the Robens Report has not only stood the test of time, but ‘still matters. And will still matter over the next 50 years.’ This is what the Robens Report says about criminal proceedings:

‘The fact is—and we believe this to be widely recognised—that the traditional concepts of the criminal law are not readily applicable to the majority of infringements which arise under this type of legislation. Relatively few offences are clear-cut, few arise from reckless indifference to the possibility of causing injury, few can be laid without qualification at the door of a particular individual. The typical infringement or combination of infringements arises rather through carelessness, oversight, lack of knowledge or means, inadequate supervision or sheer inefficiency. In such circumstances the process of prosecution and punishment by the criminal courts is largely an irrelevancy. The real need is for a constructive means of ensuring that practical improvements are made and preventive measures adopted. Whatever the value of the threat of prosecution, the actual process of prosecution makes little direct contribution towards this end. On the contrary, the laborious work of preparing prosecutions—and in the case of the Factory Inspectorate, of actually conducting them—consumes much valuable time which the inspectorates are naturally reluctant to devote to such little purpose…’.

And again: ‘We have said that criminal proceedings are inappropriate for the generality of offences that arise under safety and health at work legislation. We recommend that criminal proceedings should, as a matter of policy, be instituted only for infringements of a type where the imposition of exemplary punishment would be generally expected and supported by the public. We mean by this offences of a flagrant, wilful or reckless nature which either have or could have resulted in serious injury. A corollary of this is that the maximum permissible fines should be considerably increased…’.

Broadly, this was the approach adopted by safety regulators and the courts for a few decades following the HSWA’s enactment. A significant turning point came with British Steel [1995]. British Steel had delegated supervision to a competent supervisor. He failed to perform that task adequately resulting in a fatal accident. The Court of Appeal held that British Steel was responsible in law and, while recognising that it did not have ‘great experience in this field’, decided that imposing large fines even for such ‘technical’ breaches would promote ‘a culture of guarding against the risks’. The upshot is that courts more used to dealing with murders and mortgage frauds began determining organisational culpability in relation to complex and contentious health and safety matters by applying principles of absolute liability, and imposing ever-increasing fines as a ‘deterrence’.

Downsides?

The safety expert Dr Robert Long wrote: ‘It is a popular idea that regulation and punishment drive learning, when in fact they create new complexities and problems which previously didn’t exist.’ Dr Long describes the command and control tendencies of regulatory enforcement: ‘The paradigm goes something like this: 1. Here is the rule 2. Comply with the rule 3. Enforce the rule 4. Punish people who don’t comply 5. Put a cop on every beat and police the rule 6. We’ll catch you and watch out. Result? No ownership, limited maturity and reporting goes ‘underground’. This thinking drives a ‘nanny’ mindset, power operates rather than influence, people don’t feel recognised, people get sick and people leave. It’s a model of non-motivation and most important to note, models non-learning.’

He further comments: ‘It’s a sad state of affairs when we create systems and audits to check on systems and audits. And so, we end up with audit checks to certify inspections and audits, inspections to certify inspections, multiple layers of regulations and authorities to validate procedures and tiers of governance to govern governance.’ And then: ‘Most people believe the purpose of a system is to ‘cover their arse’. What does this cultural belief engender? Humans end up following systems in order to be compliant and then have an unspoken set of micro-rules which they really believe in and follow.’

Professor Dekker: ‘The things that get changed when a failure is met with an “unjust” response…are not typically the things that make the organization safer. It does not typically lead to improvements in primary processes. It can lead to “improvement” of all the stuff that swirls around those primary processes: bureaucracy, involvement of the organization’s legal department, bookkeeping, micro-management. Paradoxically, many such measures can make the work of those at the sharp end, those whose main concern is the primary process, more difficult, lower in quality, more cumbersome, and perhaps even less safe.’

Professor Dekker again: ‘Doubts…exist about the ability of a judiciary to make sense of the messy details of practice in a safety-critical domain, let alone resist common biases of outcome knowledge and hindsight in adjudicating people’s performance.’

Health and safety investigations, and indeed ‘regulatory’ investigations generally, are particularly susceptible to bias because they are frequently more subjective and complex than other criminal investigations. The fundamental question is usually ‘why’ (rather than for example ‘who’); and the ‘why’ concerns the behaviour of organisations and not merely individuals. Common biases include: confirmation bias, where evidence is searched for and interpreted to confirm the existing case theory (for example, that accidents are almost invariably due to management failings); outcome bias, where an evaluation of the systems is coloured by the fact that they did not prevent the harm; and hindsight bias, where knowledge of the outcome causes people to overestimate the likelihood of past events, and to judge failings with bad outcomes as more culpable.

Dekker posits: ‘…as Nietzsche pointed out, few things make us as anxious as not having a cause for things that go wrong. Without a cause, there is nothing to fix. And with nothing to fix, things could go terribly, randomly wrong again – with us on the receiving end next time. Having a criminal justice system deliver us stories that clearly carve out the disordered from order, that excise evil from good, deviant from normal, is about creating some of the order that was lost in the disruption by the bad event. Such narratives reflect, said White, “a desire to have real events display a coherence, integrity, fullness and closure of an image of life that is and can only be imaginary.”’

Dekker concludes: ‘There is no evidence…that the original purposes of a judicial system (such as prevention, retribution, or rehabilitation – not to mention getting a “true” account of what happened or actually serving “justice”) are furthered by criminalizing human error.’

Extending the Regulatory Approach

While regulatory lawyers have sought to persuade courts with varying degrees of success that it is beyond the limit of organisations to cure the human condition, it has been curious to see swathes of white-collar crime lawyers advocate for the regulatory approach to be imported into economic crimes via the ‘failure to prevent’ model.

While not based specifically on health and safety law, the FTP model is to the effect that, if established that an ‘associated person’ commits (for example) a fraud offence, and certain other conditions are met, then a (large) commercial organisation would commit an offence subject to a defence of reasonable procedures designed to prevent the offending. This turns a dishonesty offence (once fraud is established) into one of systems failure (like health and safety). With the burden (like health and safety) on the organisation to prove the reasonableness of its procedures to avoid the commission of an otherwise strict liability offence.

Once offending is established, for similar reasons to those discussed above, even the most conscientious organisations will find it difficult to persuade prosecuting authorities and the courts that it was not reasonable for them to do more. Organisations that worked conscientiously to implement measures may properly argue that no person or organisation in history has been able to prevent people from behaving dishonestly for money; and that, for evolutionary reasons, people are predisposed to believing one another, especially those who look and sound the part (the SEC’s investigation of Bernie Madoff, anyone?). While there is a little more scope for such arguments under the FTP model than health and safety law, as with health and safety cases, it will ultimately be necessary to persuade courts that it is beyond the limit even of commercial organisations to cure the human condition.

In the event of a plea or deferred prosecution agreement, however, the scope for such arguments will be even more limited. Wise judges will point to corporate ‘culture’ and conclude that ‘management’ only had to try harder, or spot those red flags which with hindsight are all too obvious. Take the Airbus DPA for FTP bribery offences. Airbus had in place externally certified bribery prevention procedures; individuals deliberately circumvented the systems, including by the creation of false invoices, payments and other compliance material; and when Airbus identified weaknesses, the systems were reviewed and updated and payments frozen. Under the DPA, Airbus was penalised €991 million in the UK as part of a €3.6 billion global resolution. Incidentally, no individuals were prosecuted.

In fact, in a world of chronic under-resourcing of law enforcement, the FTP bribery offence has seen a preoccupation with securing outcomes against organisations to the detriment of cases against those individuals suspected of committing the underlying crime. There has been an astoundingly high proportion of non-prosecutions and failed prosecutions of individuals following deferred prosecution agreements and convictions of organisations for FTP bribery. There are complex reasons for this, but the risk is plain. Punishing organisations for wrongdoing which they may be able to do little to prevent, while the individuals who allegedly perpetrate the wrongdoing walk away free, will do little to deter crime, and may in fact achieve the opposite.

What To Do?

If you’re the MD of ABC Limited surveying this landscape, what should you do about it? Well, there’s little prospect of Option 1: persuading Parliament to change course. The last 20 years of legislative reform have been in one direction, and we have an impending Labour government. For good measure, The Honourable Lord Justice Fraser was recently appointed Chair of the Law Commission.

This appointment makes Option 2 all the more difficult: persuading Lord Justice Fraser to take a secondment at ABC Limited and spot those risks which your organisation and entire industry has missed for the past 65 years. Because if you imagine that your safety, or ABC, or compliance teams, sitting in your offices in Slough, have cured the human condition, think again. It would take Lord Justice Fraser, or Peter the Great, to achieve such a miracle. (On reflection, it may only be Fraser who could pull it off: Peter the Great couldn’t even persuade his wife’s secretary to stop taking bribes, under threat of having his head chopped off.)

Option 3 is to do what you can and hope for the best. And should the worst happen, should there be a serious accident, or you sell food past its sell-by date, or pollute the nearest river, or carelessly hire someone who turns out to be a crook, then you could behave ‘commercially’ by cooperating fully, admitting any wrongdoing at the first opportunity, promising to cure the human condition going forwards, and seeking the quickest possible resolution for reasons of ‘certainty’. Should it end before a sentencing judge, offer some short mitigation and take it on the chin.

In the alternative, you could defend yourself. Sadly, Option 4 may in practice mean the same as Option 3, if the facts are against you. Or it could go the other way and end in a contested trial. So, perhaps self-reporting suspected wrongdoing, perhaps not; perhaps being proactive in seeking a non-prosecution, or a civil penalty, or deferred prosecution, perhaps not; perhaps arguing that there is a limit to what even commercial organisations can do to cure the human condition, perhaps not; perhaps arguing that the prosecution must fail for some other reason (law enforcement, though a keen advocate of the FTP model, has some difficulties managing systems of its own), perhaps not.

Ultimately, if behaving like a ‘responsible’ organisation will result in financial and reputational harm grossly disproportionate to any wrongdoing, and indeed the outcome will be about as bad as it gets, you should at least consider the alternatives.

Related articles

Can understanding insider risk help to prevent fraud?

BCL partners Anoushka Warlow and Tom McNeill team up with our friends Sarah Keeling, Julia Arbery, and Lucy Cryan at StoneTurn to explore the details of the proposed new corporate criminal offence – Failure to Prevent Fraud and Money Laundering, and explain what the proposed changes are and the actions organisations can take now to help protect themselves.


Embracing the ‘Art of the Possible’ in Novel Food Regulation – David Hardstaff & John Binns write for Food Navigator

BCL partners David Hardstaff and John Binns have written for Food Navigator discussing how a high-profile investigation into banned muscle-building drugs highlights the issues with the novel food regime impacting the CBD industry.
