Third, the task of a platform is not to second-guess whether the authorities would prosecute, but to decide whether it has reasonable grounds to infer that the content falls within the letter of the law. Whilst the Bill makes numerous references to proportionality, that does not affect the basis on which the platform must determine illegality. That is a binary, yes or no assessment. There is no obvious room for a platform to conclude that something is only a little bit illegal, or to decide that, once detected, some content crossing the ‘reasonable grounds to infer’ threshold could be left up. Certainly the political expectation is that any detected illegal content will be removed.
If that is right, the assessment that platforms are required to make under the Bill lacks anything akin to the ameliorating effect of prosecutorial discretion on the rough edges of the criminal law. Conversely, to build such discretion, even principles-based, into the decision-making required of platforms would hardly be a solution either, especially not at the scale and speed implied by automated proactive detection and removal obligations. We do not want platforms to be arbiters of truth, but to ask them (or their automated systems) to be judges of the public interest or of the seriousness of offending would be a recipe for guesswork and arbitrariness, even under the guidance of Ofcom.
If this seems like a double bind, it is. It reflects a fundamental flaw in the Bill’s duty of care approach: the criminal law was designed to be operated within the context of the procedural protections provided by the legal system, and to be adjudged by courts on established facts after due deliberation; not to be the subject of summary justice dispensed on the basis of incomplete information by platforms and their automated systems tasked with undertaking proactive detection.
Fourth, we shall see that in some cases the task required of the platform appears to involve projection into the future on hypothetical facts. Courts are loath to assess future criminal illegality on a hypothetical basis. Their task at trial is to determine whether the events that are proved in fact to have occurred amounted to an offence.
Fifth, inaccuracy. False positives are inevitable with any moderation system – all the more so if automated filtering systems are deployed and are required to act on incomplete information (albeit Ofcom is constrained to some extent by considerations of accuracy, effectiveness and lack of bias in its ability to recommend proactive technology in its Codes of Practice). Moreover, since the dividing line drawn by the Bill is not actual illegality but reasonable grounds to infer illegality, the Bill necessarily deems some false positives to be true positives.
Sixth, the involvement of Ofcom. The platform would have the assistance of a Code of Practice issued by Ofcom. That would no doubt include a section describing the law on encouragement and assistance in the context of the s.24 illegal entry offences under the 1971 Act, and would attempt to draw some lines to guide the platform’s decisions about whether it had reasonable grounds to infer illegality.
An Ofcom Code of Practice would carry substantial legal and practical weight. That is because the Bill provides that taking the measures recommended in a Code of Practice is deemed to fulfil the platform’s duties under the Bill. Much would therefore rest on Ofcom’s view of the law of encouragement and assistance and what would constitute reasonable grounds to draw an inference of illegality in various factual scenarios.
Seventh, the involvement of the Secretary of State. Ofcom might consider whether to adopt the Secretary of State’s ‘in a positive light’ interpretation. As the Bill currently stands, if the Secretary of State did not approve of Ofcom’s recommendation for public policy reasons s/he could send the draft Code of Practice back to Ofcom with a direction to modify it – and, it seems, keep on doing so until s/he was happy with its contents.
Even if that controversial power of direction were removed from the Bill, Ofcom would still have significant day-to-day power to adopt interpretations of the law and apply them to platforms’ decision-making (albeit Ofcom’s interpretations would in principle be open to challenge by judicial review).
As against those seven points, in fulfilling its duties under the Bill a platform is required to have particular regard to the importance of protecting users’ right to freedom of expression within the law. ‘Within the law’ might suggest that the duty has minimal relevance to the illegality duties, especially when clause 170 sets out expressly how platforms are to determine illegality. It provides that if the ‘reasonable grounds to infer’ test is satisfied, the platform must treat the content as illegal.
The government’s ECHR Memorandum suggests that the ‘have particular regard’ duty may have some effect on illegality determination, but it does not explain how it does so in the face of the express provisions of clause 170. It also inaccurately paraphrases clause 18 by omitting ‘within the law’:
“34. Under clause 18, all in-scope service providers are required to have regard to the importance of protecting freedom of expression when deciding on and implementing their safety policies and procedures. This will include assessments as to whether content is illegal or of a certain type and how to fulfil its duties in relation to such content. Clause 170 makes clear that providers are not required to treat content as illegal content (i.e. to remove it from their service) unless they have reasonable grounds to infer that all elements of a relevant offence are made out. They must make that inference on the basis of all relevant information reasonably available to them.”
That is all by way of lengthy preliminary. Now let us delve into how a platform might be required to go about assessing the legality of a Channel dinghy video, first under the Accessories and Abettors Act 1861 and then under the companion encouragement and assistance offences in the Serious Crime Act 2007.
Let us assume that the Secretary of State is right: that posting a video of people crossing the Channel in dinghies, which shows that activity in a positive light, can in principle amount to encouraging an illegal entry offence. In the interests of simplicity, I will ignore the Secretary of State’s reference to conspiracy. How should a platform go about determining illegality?
Spoiler alert: the process is more complicated and difficult than the Secretary of State’s pronouncement might suggest. And in case anyone is inclined to charge me with excessive legal pedantry, let us not forget that the task that the Bill expressly requires a platform to undertake is to apply the rules laid down in the Bill and in the relevant underlying offences. The task is not to take a rough and ready ‘that looks a bit dodgy, take it down’, or ‘the Home Secretary has complained about this content so we’d better remove it’ approach. Whether what the Bill requires is at all realistic is another matter.
Aiding, abetting and counselling – the 1861 Act
Aiding, abetting and counselling (the words used by the Secretary of State) is the language of the 1861 Act: “Whosoever shall aid, abet, counsel or procure the commission of any indictable offence … shall be liable to be tried, indicted and punished as a principal offender.”
One of the most significant features of accessory liability under the 1861 Act is that there can be no liability for aiding, abetting, counselling or procuring unless and until the principal offence has actually occurred. Whilst the aiding, abetting etc does not have to cause the principal offence that occurred, there has to be some connecting link with it. As Toulson LJ put it in Stringer:
“Whereas the provision of assistance need not involve communication between D and P, encouragement by its nature involves some form of transmission of the encouragement by words or conduct, whether directly or via an intermediary. An un-posted letter of encouragement would not be encouragement unless P chanced to discover it and read it. Similarly, it would be unreal to regard P as acting with the assistance or encouragement of D if the only encouragement took the form of words spoken by D out of P’s earshot.”
Timing

This gives rise to a timing problem for a platform tasked with assessing whether a video is illegal. For illegality to arise under the 1861 Act the video must in fact have been viewed by someone contemplating an illegal entry offence, the video must have encouraged them to enter the UK illegally, and they must have proceeded to do so (or attempted to do so).
Absent those factual events having taken place, there can be no offence of aiding and abetting. The aiding and abetting offence would further require the person posting the video to have intended the person contemplating illegal entry to view the video and to have intended to encourage their subsequent actual or attempted illegal entry.
Thus if a platform is assessing a video that is present on the platform, in order to adjudge the video to be illegal it would at a minimum have to consider how long it has been present on the platform. That is because there must be reasonable grounds to infer both that a prospective migrant has viewed it and that since doing so that person has already either entered the UK illegally or attempted to do so. Otherwise no principal offence has yet occurred and so no offence of aiding and abetting the principal offence can have been committed by posting the video.
It may in any case be a nice question whether, in the absence of any evidence available to the platform that a prospective migrant has in fact viewed the video, the platform would have reasonable grounds to infer the existence of any of these facts. To do so would appear to involve assuming both that someone has viewed the video and that the assumed viewing has in fact encouraged a connected illegal entry offence.
For a post blocked by filtering at the point of upload (if that were considered feasible) the timing issue becomes a conundrum. Since no-one can have viewed a blocked video, none of the required subsequent events can possibly have occurred. Nor does the law provide any offence of attempting to aid and abet a 1971 Act offence.
Thus, at least for upload filtering, it appears that either there is a conceptual bar to a platform determining that a video blocked at the point of upload amounts to aiding and abetting, or the platform would (if the Bill permits it) have to engage in some legal time travel and assess illegality on a hypothetical future basis.
A basis on which a platform could be required to assess such hypothetical illegality may be provided by clause 53(14)(b) of the Bill, which in effect provides that illegal content includes content that would be illegal if it were present on the platform.