International: Unpacking self-preferencing in digital markets - part one
The discussion of competition law in the 21st century is incomplete without addressing self-preferencing. Big Tech companies have secured dominant positions not just in a single market but across markets, and their significant control over data gives them a competitive advantage: data collected in one market can be leveraged to penetrate another. Once established in the new market, they adopt various strategies, including self-preferencing, to eliminate competition. Regulating digital markets has therefore required taking the initiative to revise and amend existing regulations.
In Part 1 of this Insight series, Prashanth Shivadass, Partner at Shivadass & Shivadass (Law Chambers), delves into the concept of self-preferencing in an algorithm-driven economy and in Part 2, analyses the role of data protection laws in tackling the data advantage possessed by Big Techs.
Self-preferencing
Self-preferencing is a practice whereby an intermediary selling goods or services from a large number of providers favors its own downstream products. Broad categories of self-preferencing behavior include manipulating search results, privileged access to data, preferential treatment of customers who use favored products, a dual-faced platform operator charging third-party sellers for distribution, tying and bundling products in adjacent markets, and preferential access to platform features. Although self-preferencing is not specific to digital markets, concern about the practice has grown because of the large scale at which digital platforms operate. Self-preferencing is not per se anti-competitive. However, when the conduct eliminates downstream competitors and entrenches the platform's position, it becomes anti-competitive. Hence, there is a need to regulate the practice of self-preferencing in the digital economy.
Algorithm-driven economy and related case studies
Algorithms are the bedrock of the digital economy. They have a broad spectrum of applications, and algorithm-heavy environments affect the effective operation of competition. Algorithmic theories of harm fall into three groups: algorithmic manipulation and collusion through self-preferencing, algorithmic exclusionary conduct, and exploitative conduct. Platforms may manipulate algorithms to favor their own products and boost those products' rankings.
E-commerce giant's add-to-cart algorithm
One of the leading E-commerce giants adopted self-preferencing practices by manipulating search algorithms. Its add-to-cart algorithm has been at the center of various antitrust investigations: it was alleged to be biased because it favored the E-commerce giant's own products and those of sellers who used its fulfillment services. The case never proceeded to a final verdict, as the commitments offered by the E-commerce giant were accepted. Although a verdict would have offered clearer guidance, the E-commerce giant's commitments are a significant step toward better corporate governance.
Search engine's AdTech
A leading search engine was accused in the EU of favoring its own online advertising services on the back of massive data collection, which gave it privileged access to data. This is not only a competition law concern but also a data privacy concern. The search engine's data-related practices remain under probe. It is also facing class action litigation in the Netherlands for violation of privacy laws, having been accused of collecting and profiling users' behavior and sharing that information with online advertising platforms. Another Dutch class action suit has been filed against the search engine for collecting the data of Android users between 2022 and 2024. It has further been accused of installing a special application that is not visible as an icon on the device, is therefore unrecognizable by the user, and cannot be uninstalled; the search engine allegedly uses this application to track the usage of all other applications. Moreover, the legality of the search engine's massive processing of consumer location data was also under probe, with allegations that it had kept consumers under constant commercial surveillance. The search engine agreed to a record $391.5 million privacy settlement for misleading consumers into thinking they had turned off location tracking while it continued to collect location data. This case is a classic example of how access to large amounts of data is used to monopolize a market position.
Premier handset's update
One leading mobile manufacturer's operating system update is a prominent example of the tension between data protection and competition law. The update removed the default opt-in that had allowed third-party applications to track user data, while the manufacturer's own applications retained an in-built default opt-in. This gave the manufacturer privileged access to user data. While the update was advertised as a privacy-preserving policy, it prevented third-party applications from competing on an equal footing. It was, in effect, an anti-competitive strategy in the guise of a privacy-preserving policy.
Ride-hailing services' algorithmic exclusionary conduct
One ride-hailing service's program is a classic example of algorithmic exclusionary conduct to favor one's own services. The combination of algorithms and Big Data can help companies deny inputs to competitors, forcing them to exit the market. Using Big Data and algorithms, a leading ride-hailing application identified and targeted drivers who also drove for competitors. More rides with higher pay were diverted to those drivers, and special bonuses were awarded to drivers who crossed a targeted number of rides in a week. Data on drivers who drove for competitors in a particular area was allegedly collected by creating fake accounts on the competitor's platform; this data was combined with data on drivers offering services via the ride-hailing application in the same area to identify drivers who drove for both the company and the competitor. An analysis of this case reveals that the company not only engaged in an anti-competitive practice but also violated the drivers' privacy. Before the District Court Judge in California, the company argued that the drivers had consented to sharing their location data when they used its platform, and that there was no proper allegation of loss of money or property to invoke competition law provisions. The Judge dismissed the suit.
Self-preferencing and data use-purpose limitation
The strategies discussed in the case studies, coupled with massive data collection, are enabling digital platforms to build their own brands, implicating both competition law and data protection law. The ride-hailing application's strategy involved violations of privacy law as well as competition law: the drivers had consented to sharing their location for a different purpose, yet that consent was used as a defense, resulting in the dismissal of the suit. Such cases have given rise to the notion that privacy laws kill competition in digital markets. However, privacy law cannot be treated as a defense to competition law. Rather, data protection laws can be used as an effective tool to curb the practice of self-preferencing. Data collected by tech giants to provide platform services is used by them to enter other markets, which is a sheer violation of the principle of purpose limitation.
Prashanth Shivadass Partner
[email protected]
Shivadass & Shivadass (Law Chambers), Bangalore