When does a hosting service lose its immunity? Apple may be liable for illegal loot boxes in a game

An iPhone user with a gambling problem spent tens of thousands of euros on loot boxes in a game app and sought to hold Apple, as the operator of the App Store, liable for this damage. The Antwerp Enterprise Court[1] ruled that the loot box mechanism qualifies as a (prohibited) game of chance under Belgian law, but the question remains whether Apple can invoke the immunity of a hosting service. Preliminary questions on this matter were referred to the Court of Justice of the European Union (CJEU).

Are loot boxes games of chance?

‘Loot boxes’ is a collective term for game elements through which the player acquires in-game items in a seemingly random manner, whether or not in exchange for payment. According to the Belgian Gaming Act, a game of chance requires four elements: (1) a game, (2) a stake, (3) the possibility of winning or losing, and (4) chance (even if incidental).

Apple contested the applicability of each element, but substantiated its defence only for the elements of stake and chance, so the Antwerp Enterprise Court considered the remaining elements to be established.

With regard to the ‘stake’, the court noted that the availability of free loot boxes does not mean that the paid variants fall outside the definition.

With regard to the element of ‘chance’, the court ruled that disclosing the probabilities attached to a loot box does not provide clarity about its specific contents, and that even the slightest element of chance in the course of the game is sufficient. The decisive factor was that the game developer itself acknowledged that chance plays a role.
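For illustration only (this sketch is not part of the ruling): a minimal model of a typical paid loot-box draw, showing why published drop rates still leave the specific outcome to chance. The tier names, item names, and percentages below are hypothetical.

```python
import random

# Hypothetical disclosed drop rates per rarity tier (must sum to 1.0)
DROP_RATES = {"common": 0.80, "rare": 0.15, "legendary": 0.05}

# Hypothetical item pools per tier; the buyer cannot know in advance
# which specific item will drop, only the tier probabilities.
ITEMS = {
    "common": ["wooden sword", "cloth armor"],
    "rare": ["silver shield", "enchanted bow"],
    "legendary": ["dragon mount"],
}

def open_loot_box(rng: random.Random) -> str:
    """Draw a rarity tier according to the disclosed rates, then a random item.

    Even though the tier probabilities are published, the specific item
    received remains determined by chance - the point the court made.
    """
    tiers = list(DROP_RATES)
    tier = rng.choices(tiers, weights=[DROP_RATES[t] for t in tiers])[0]
    return rng.choice(ITEMS[tier])
```

Two players paying the same price can thus receive items of very different value, which is why disclosing drop rates does not remove the element of chance.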

The court therefore concluded that loot boxes qualify as games of chance under Belgian law. Under the Gaming Act, it is prohibited to operate a game of chance without a licence from the Gaming Commission, or to facilitate its operation or advertise it while knowing that no such licence has been obtained. Since it was established that this licence had not been obtained, Apple violated this legal standard. This constitutes a wrongful act that may give rise to (full) compensation for the damage suffered.

Does Apple enjoy immunity as a hosting service?

Apple defended itself by arguing that it enjoys immunity under the e-Commerce Directive[2] (this Directive has since been replaced by the Digital Services Act[3] (‘DSA’)).

The safe harbour provision in this Directive exempts a hosting service from liability for stored information provided by recipients of the service, provided that certain conditions are met (see below).[4]

Applicability of safe harbour provisions to gambling services

The first question that arises is whether the safe harbour provisions (immunity) in the e-Commerce Directive are applicable to gambling activities. The e-Commerce Directive excluded gambling activities from its scope, but a statement by the European Commission suggests that the immunity regime in this Directive does apply to ‘gambling-related content’. Since the DSA, unlike the e-Commerce Directive, does not provide for an exclusion for gambling activities, this question is now largely of historical relevance. If the Directive applies, the court also asks whether the concept of ‘gambling activities’ should be interpreted in accordance with national law or whether it is an autonomous concept of EU law.

Applicability of safe harbour provisions to software

According to the court, Apple’s App Store is an ‘information society service’, but it is unclear whether the software offered by the App Store falls under the concept of ‘information’. This question will also be referred to the CJEU. 

Apple's role as operator of the App Store

Based on the e-Commerce Directive and EU case law, a hosting service can only invoke the safe harbour provisions if it:

  1. acts passively;
  2. has no knowledge of the illegal activity or information[5];
  3. acts expeditiously to remove such content; and
  4. exercises no authority or control over the recipient of the service.

The court ruled that points 1 and 2 are problematic.

With regard to the first point (passive role), the court found that Apple extensively reviews the apps submitted by developers and that the guidelines it imposes reflect its own policy choices. The court asked the CJEU whether Apple’s extensive review means that its role is not merely ‘passive’ and that the software is therefore offered under its supervision.

With regard to the second point (knowledge of illegality), the court found that Apple had sufficient knowledge of the fact that loot boxes are illegal in Belgium in the absence of a licence. However, this knowledge was of a general nature and related to the phenomenon of ‘loot boxes’ as a whole, not to the game in question specifically. The court asked the CJEU whether this general knowledge is sufficient or whether specific knowledge of the individual content is required.

The court noted that the DSA is useful for interpreting the former e-Commerce Directive. From recital 22 of the DSA[6], it may be inferred that immunity is only excluded in cases of specific knowledge of illegality, which would argue in Apple’s favour. However, definitive clarity on this issue will need to be provided by the CJEU.

Conclusion

This ruling makes it clear that loot boxes qualify as games of chance under Belgian law. It also refers crucial questions to the CJEU, the answers to which could determine the future of platform liability:

  1. Can the immunity of the hosting service apply to gambling activities, and how should the term ‘gambling activities’ be interpreted? (The DSA no longer contains this exclusion for gambling activities.)
  2. Does software fall under the concept of ‘information’?
  3. Is general knowledge of the illegality of a category of content sufficient to exclude immunity, or must the service provider have specific knowledge of individual content?
  4. Does the app approval process mean that the recipient of the service acquires these apps under the supervision of the service provider?

Apple will only be able to invoke its immunity if the answer to the first two questions is affirmative and the answer to the last two questions is negative. The CJEU’s answers will not only determine Apple’s liability in this case, but will also guide the interpretation of the DSA and the degree of responsibility that hosting providers bear for the content on their platforms.

[1] Antwerp Enterprise Court, Antwerp Division, 16 January 2025, A/23/04416, unpublished.

[2] Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market.

[3] Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC.

[4] Former Article XII.19 of the Code of Economic Law and Article 14 of the e-Commerce Directive, now replaced by Article 6 of the DSA.

[5] In the DSA, ‘illegal activity or information’ was replaced by ‘illegal activity or illegal content’.

[6] “[…] The provider can obtain such actual knowledge or awareness of the illegal nature of the content, inter alia through its own-initiative investigations or through notices submitted to it by individuals or entities in accordance with this Regulation in so far as such notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and, where appropriate, act against the allegedly illegal content. However, such actual knowledge or awareness cannot be considered to be obtained solely on the ground that that provider is aware, in a general sense, of the fact that its service is also used to store illegal content. Furthermore, the fact that the provider automatically indexes information uploaded to its service, that it has a search function or that it recommends information on the basis of the profiles or preferences of the recipients of the service is not a sufficient ground for considering that provider to have ‘specific’ knowledge of illegal activities carried out on that platform or of illegal content stored on it.”