1. Introduction
With the widespread application of algorithmic technology across the economy and society, from Amazon's differentiated pricing for new and returning users to the "exploitation" of Meituan members, big data "price discrimination" is becoming increasingly common. Yet the academic community has not reached a settled view of its legal nature, and the relevant institutional framework urgently needs improvement. This practice not only harms consumers' rights and interests and the order of market competition, but also poses a severe test to the existing legal regulatory system. This article begins with the connotation of big data "price discrimination", clarifies its legal characterization, examines its practical predicaments in legislation, law enforcement, and adjudication, and attempts to improve the regulatory path, so as to safeguard consumers' rights and interests and promote the healthy and orderly development of the digital economy.
2. The connotation of big data price discrimination behavior
2.1. Definition of big data price discrimination
From an economic perspective, the practice of big data "price discrimination" can be classified as first-degree price discrimination. That is, based on their knowledge of consumers' reservation prices, operators set prices at the highest amount each consumer is willing to pay for a product or service, thereby capturing all consumer surplus.
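For illustration, this can be stated in the standard textbook form (the notation below is introduced here, not drawn from the article). Let \(r_i\) be consumer \(i\)'s reservation price and \(c\) the marginal cost. Under a uniform price \(p\), consumer surplus is \(\sum_i \max(r_i - p, 0)\), which is generally positive; under first-degree price discrimination each buyer is charged \(p_i = r_i\), so consumer surplus falls to \(\sum_i (r_i - p_i) = 0\) while the operator's profit rises to \(\sum_{i:\, r_i \ge c} (r_i - c)\). The entire surplus is transferred from consumers to the operator.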
However, big data "price discrimination" is not a formal term of economics; it is a colloquial expression that has gradually taken shape in a particular social context. In the author's view, big data "price discrimination" refers to the conduct of e-commerce platforms that extensively collect and analyze consumer data through algorithms (chiefly price tolerance, ability to pay, preferences, family composition, page dwell time, and the like), construct user profiles, and implement differentiated pricing for homogeneous goods or services in order to maximize profits.
2.2. Characteristics of big data price discrimination
The first is concealment. Algorithmic pricing decisions are often opaque, making it difficult for consumers to detect big data price discrimination by operators. The transaction process is completed online, and there is information asymmetry between consumers and operators. Unless consumers deliberately compare prices with other consumers, it is difficult for them to know whether the price offered to them is uniform or personalized. In addition, platforms have created numerous shopping festivals for promotions, with a steady stream of holiday discounts and coupons, making it even harder for consumers to identify big data "price discrimination".
The second is technicality. Relying on their command of user data and algorithmic technology, platform operators integrate and reorganize user data in a highly technical process. For example, mining users' personal information involves the deep integration of big data and algorithms. The key steps are the establishment of the user's personal information database and the preprocessing of that information, which involves tasks such as data reading, data downloading, data analysis, data filtering, and encryption processing, all of which require complex algorithmic technology as support.
The third is two-way harm. On the one hand, big data "price discrimination" infringes upon consumers' right to know, right to fair trade, and right to free choice. First, operators should disclose any information that may influence consumers' purchasing decisions; big data price discrimination in effect conceals the true price of goods or services, infringing upon consumers' right to know. Second, big data price discrimination leads consumers to believe they have transacted with the merchant on equal terms, distorting their true intentions in the transaction. This practice of pricing the person rather than the goods, charging different prices for the same goods, runs contrary to the very notion of fairness and infringes upon consumers' right to fair trade. Third, personalized pricing narrows the space for consumers to compare and select goods or services, confining them to the "information cocoon" woven by the operator and depriving them of freedom of choice. On the other hand, big data price discrimination distorts the order of the market economy. It destroys basic trust among market entities, which in turn reduces transaction efficiency and disrupts the trading order. Platforms that carry out big data price discrimination have an exclusionary and exploitative effect on other market operators, which is not conducive to the emergence of genuinely valuable products and fosters a vicious competitive situation of "bad money driving out good". Enterprises can reap large profits without improving product quality, which dampens their enthusiasm for product innovation and raises the entry costs for new market players.
2.3. The technical implementation path of big data price discrimination
The first stage is data collection. Operators exploit the traffic-aggregation advantage of the platform's access points to collect as much user data as possible without users' knowledge and to share user data through database "collision" (cross-matching). This is the prerequisite for implementing big data "price discrimination".
The second stage is user profiling, in which operators use their comprehensive databases and increasingly sophisticated algorithms to create a precise profile of each user, calculate the types and brands of goods or services the user might purchase and the maximum price the user can accept, and push personalized goods to users to achieve "precision marketing". This is the key step in big data "price discrimination".
The third stage is differential pricing, in which operators implement differentiated pricing on the basis of data collection, analysis, and algorithm application, so that different consumers pay different prices for the same goods or services, thereby maximizing transactions and profits [1]. Once users passively accept being charged in certain ways on an Internet platform, their trust in the platform continually reinforces their consumption inertia and they become less sensitive to fluctuations in the prices of goods and services; that is, they no longer repeatedly compare the prices offered by different merchants. Such consumers are also the most vulnerable to round after round of price gouging on the platform.
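To make the three stages concrete, the following is a minimal, purely illustrative Python sketch, not a description of any real platform's system: it derives a toy "willingness-to-pay" score from a few assumed profile features and then applies a personalized markup to a base price. The feature names, weights, and the 30% markup cap are all hypothetical assumptions.

```python
from dataclasses import dataclass

@dataclass
class UserProfile:
    # Illustrative features only; real platforms use far richer data.
    avg_order_value: float       # historical spend per order
    price_checks_per_order: int  # how often the user compares prices elsewhere
    is_frequent_buyer: bool      # proxy for trust in / dependence on the platform

def estimated_willingness(profile: UserProfile) -> float:
    """Toy willingness-to-pay score in [0, 1]; the weights are assumptions."""
    score = 0.4 if profile.is_frequent_buyer else 0.2
    score += min(profile.avg_order_value / 1000.0, 0.4)
    score -= min(profile.price_checks_per_order * 0.05, 0.2)  # price-comparing users score lower
    return max(0.0, min(score, 1.0))

def personalized_price(base_price: float, profile: UserProfile) -> float:
    """Stage three: differential pricing, capped at +30% over base (assumed cap)."""
    markup = 0.30 * estimated_willingness(profile)
    return round(base_price * (1 + markup), 2)

loyal = UserProfile(avg_order_value=600, price_checks_per_order=0, is_frequent_buyer=True)
new = UserProfile(avg_order_value=150, price_checks_per_order=4, is_frequent_buyer=False)
print(personalized_price(100.0, loyal))  # 124.0 - higher price for the less price-sensitive user
print(personalized_price(100.0, new))    # 104.5
```

Running the sketch, the loyal, non-price-comparing user is quoted a higher price for the same item than the price-sensitive new user, which is exactly the "pricing the person, not the goods" pattern described above.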
3. The legal characterization of the "big data price discrimination" behavior
Clarifying the legal characterization of big data price discrimination is a prerequisite for its effective regulation, but no consensus has yet emerged in academic circles.
The first is the price discrimination theory. Most scholars believe that the essence of big data "price discrimination" is price discrimination: when selling the same product, operators charge different users different prices based on their differing purchasing power and purchase intentions. If a platform holds a dominant market position, it may impose unreasonable price differences on different user groups through algorithms, constituting the "differential treatment" stipulated in Article 22, Paragraph 6 of the Anti-Monopoly Law. Article 9 of the Anti-Monopoly Law explicitly prohibits operators from using data and algorithms to engage in monopolistic practices, providing a direct basis for regulating "price discrimination" in the platform economy. Some scholars argue that, in the era of big data, precision marketing by operators without a dominant market position, who exploit their information advantage to analyze consumers' price tolerance, consumption preferences, consumption habits, and so on, has an adverse impact on the market economic order that significantly outweighs any favorable impact on business efficiency and consumer welfare [2]. On this view, it is necessary to relax the requirement of a dominant market position and expand the scope of the subjects of price discrimination under the Anti-Monopoly Law. The current Anti-Monopoly Law, however, does not adopt this view.
This theory, however, overlooks the difference between price discrimination in economics and price discrimination as defined by law. The purpose of big data price discrimination is to extract the highest price from each consumer; the appearance of "a thousand prices for a thousand people" arises only because consumers' reservation prices differ. Mistaking this appearance for the essence leads to an untenable legal consequence: the law prohibits price discrimination only when it damages competition, yet the victims of big data price discrimination are usually end consumers [3].
The second is the price fraud theory. Big data "price discrimination" rests on the economic theory of price discrimination and on algorithmic pricing decisions that produce "a thousand prices for a thousand people". On the surface, it is biased pricing by an operator in violation of the obligation to clearly mark prices; in essence, it is conduct that misleads the consumer. Insofar as the operator covertly raises prices against the consumer's true will, it can be regarded as price fraud committed by abusing an information advantage. It constitutes price fraud in the legal sense when the operator subjectively has direct intent to defraud, objectively conceals price differences, and thereby misleads consumers. In the case of Ms. Hu v. Ctrip, Ctrip advertised a 15% discount for diamond members, but the actual price was 137% higher than the hotel's market rate. The court found that Ctrip had committed price fraud by fabricating a discounted price, reflecting this logic [4].
The third is the algorithmic abuse theory. On this view, the abuse of algorithmic power is the essence and root cause of big data "price discrimination": e-commerce operators who hold algorithmic power use pricing algorithms to label highly price-sensitive users as "exploitable" and set differentiated prices for them. Although the abuse of algorithmic power is indeed a key step in big data price discrimination, this theory cannot comprehensively capture the illegality of the practice; it is insufficiently comprehensive and neglects the protection of consumer rights and the order of market competition.
In sum, big data price discrimination has the form of price discrimination but the essence of price fraud; it should be characterized as a new type of price fraud, while monopolistic platforms should additionally be brought within the anti-monopoly regulatory framework for more forceful regulation.
4. The regulatory dilemma of big data price discrimination
4.1. Legislative dilemmas of big data price discrimination
In China's current legal system, several laws touch on big data price discrimination, but they lack targeted provisions, their concepts are vague and subjective, and the overlapping provisions are poorly coordinated, so they struggle to produce real social effect. The differential treatment regulated by the Anti-Monopoly Law applies only to platforms with a dominant market position; it can hardly reach the price discrimination of small and medium-sized enterprises, and the blurred market boundaries of the platform economy make a dominant market position very difficult to establish. The E-commerce Law requires platforms to provide non-personalized options, but it does not define the nature of big data "price discrimination", cannot cover all types of "price discrimination", and does not lower the basic standard of proof the claimant must meet. The Consumer Rights and Interests Protection Law lowers the regulatory threshold, but the relief it offers users is limited, and the allocation of the burden of proof and the standards of compensation still need to be refined.
In addition, technological iteration has created legal gaps: the quantitative standard for "unreasonableness" in differential treatment, the "justifications" for price differences, and the meaning of "equal trading conditions" are not spelled out, and the specific matters they cover are unclear. The ambiguity of these provisions increases judicial difficulty and hinders effective regulation of big data "price discrimination".
4.2. Enforcement dilemmas of big data price discrimination
In addition to legislative incompleteness, there are enforcement difficulties in regulating big data price discrimination.
4.2.1. Insufficient regulatory penetration
Due to information asymmetry, big data "price discrimination" is highly concealed under complex discount rules and relatively private payment processes, making it difficult to detect, prove and determine, thus creating regulatory difficulties.
First, algorithms are highly opaque. Objectively, the neural network techniques used in data-driven algorithms make the learning process hard for users to follow because of their nonlinear complexity, the complexity of training and optimization, and their large parameter spaces. Subjectively, some platform companies embed behavioral steering and profit-seeking into rule-driven algorithms and refuse to disclose their pricing logic on grounds of trade secrets or algorithm security, artificially constructing "algorithmic black boxes" that make it difficult for regulators to identify and respond to the risks algorithms pose in a timely manner. The data reporting obligation proposed by some scholars [5] can be imposed only on "gatekeepers" and cannot prevent other enterprises from engaging in big data "price discrimination" [6].
Second, regulatory authorities lack the capacity to audit algorithms and thus find it difficult to verify the fairness of pricing models. Their technical equipment lags behind and relevant technical personnel are in short supply, so they cannot meet the needs of enforcement. Moreover, big data and algorithmic technologies evolve constantly, making it difficult for regulators to anticipate and forestall potential market dominance from a technical perspective, or to accurately identify and curb platform "price gouging".
Third, changes in market demand cause real-time, short-lived price fluctuations, and the real-time nature of dynamic pricing makes it difficult for regulators to capture pricing data as it occurs.
4.2.2. The challenge of cross-domain regulation
When regulating the abuse of algorithmic power by platforms, jurisdiction involves multiple departments and cross-regional collaboration; regulatory authority is dispersed and inter-departmental coordination is insufficient, which easily produces a "regulatory vacuum". In addition, because different departments enforce on different legal bases, the same big data "price discrimination" conduct may attract widely varying penalties and intensities of enforcement, leading either to overlapping enforcement by multiple departments or to passivity and inaction.
4.2.3. The regulatory principles are incomplete
The current regulation of big data price discrimination takes a one-size-fits-all approach, responding to the public's urgent demand to solve the problem and stabilizing market order in the short term. But differentiated, personalized pricing is a means of allocating market resources; to some extent it can promote competition and increase consumer welfare, and it does not necessarily violate the fairness of the market mechanism. The core task of legal regulation is to prevent genuine discrimination based on user characteristics, not to deny the market's flexible pricing mechanism. Research has found that when user switching costs are below a threshold (about 15%), allowing "tailor-made pricing" can increase corporate profits by 12% and consumer surplus by 7%.
4.3. The judicial dilemma of big data price discrimination
In addition to the gaps in legislation and enforcement, it is also difficult for consumers to protect their rights through judicial channels.
4.3.1. The absence of the public interest litigation system
In the context of big data "price discrimination", victims are numerous and the cost of individual rights protection is high, so public interest litigation is not only more convenient but also helps amplify the social impact of a case. However, the qualified plaintiffs stipulated in the Consumer Rights and Interests Protection Law are limited to the China Consumers Association and provincial consumer associations, and the class action mechanism is unsound. Civil public interest litigation can be initiated only by people's procuratorates at or above the level of a city divided into districts, and only where the operator is found to have engaged in monopolistic practices, a scope too narrow to regulate the abuse of algorithmic power.
4.3.2. It is difficult for consumers to provide evidence
Under the current rules of proof in civil litigation, consumers must prove that the platform engaged in differential pricing. But because the data is in the platform's hands, the threshold for obtaining and preserving evidence is high. In cases such as Zheng Yugao v. Shanghai Ctrip Business Co., Ltd. (tort liability dispute) and Liu Quan v. Beijing Sankuai Technology Co., Ltd. (tort liability dispute), the plaintiffs lost either because they failed to produce evidence or because the evidence was insufficient. Operators typically defend themselves on grounds such as differences in the model or configuration of the goods, package discounts, or different points in time, without disclosing their specific algorithms, rules, and data. Consumers find it difficult to distinguish legitimate defenses from big data "price discrimination", and the fault of platform operators is hard to establish. In small consumer disputes, the cost of protecting one's rights often exceeds the loss, so most consumers simply give up.
4.3.3. The standard of damages is ambiguous
Consumers usually discover that they have been overcharged by comparing prices with other consumers or by switching login devices, and "a thousand prices for a thousand people" makes the normal price difficult to determine. Even if a benchmark price can be identified, the difference between the highest and lowest prices accepted for the same goods or services cannot simply be treated as the consumer's loss, let alone support the damages regime that would follow. Current law does not specify how compensation for "price gouging" is to be calculated, and punitive damages are rarely applied.
5. Legal regulatory pathways for big data price discrimination
Challenges such as overlapping and poorly coordinated legal provisions, insufficient regulatory penetration, and an imperfect judicial relief mechanism have made big data price discrimination difficult to regulate. Its legal regulation should follow the principles of balancing technological development with rights protection, combining internal supervision with external regulation, and giving equal weight to ex ante supervision and ex post accountability.
5.1. Legislative level
5.1.1. Clarify legal definitions
Issue specific judicial interpretations to define the concept and constituent elements of big data price discrimination, refine quantitative standards for unreasonable differential treatment in light of factors such as transaction scenarios, price differences, and market competition conditions, and improve the legal basis for regulation. This would provide the necessary legal guarantees both for the government's unified supervision of market order and for industry associations' self-regulatory oversight of business entities' pricing. At the same time, the Supreme People's Court may compile and publish guiding cases for reference by all parties.
5.1.2. Establish clear rules for the application of law
Where legal provisions overlap, the boundaries of authority of each enforcement department should be drawn under the guiding principles of no double evaluation and full evaluation [7]. The Anti-Monopoly Law should apply first to platforms with a dominant market position; the Consumer Rights and Interests Protection Law should apply first to platforms without a dominant market position whose big data "price discrimination" infringes consumers' right to know, right to choose, and right to fair trade and disrupts the fair trading order of the market. Where neither applies, the Price Law should govern.
The determination of the subject of differential treatment should not be confined to the traditional market-share approach. It should instead start from the platform's actual transaction opportunities, its technical capacity to process data, and market entry barriers, and use factors such as user effects as auxiliary criteria in a comprehensive assessment of whether a dominant market position exists. In specific cases, the platform's operating characteristics and the types of services it provides should be distinguished, and factors such as whether the market entity's profitability in the relevant products and services is significantly higher than that of other operators, and how dependent the relevant users are on its products, should be considered in inferring a dominant market position. For high-frequency consumption sectors such as ride-hailing and food delivery, regional markets can be delineated by geofencing; for platforms with high user stickiness, users' time and attention can serve as a dimension for measuring the market.
5.2. Enforcement level
5.2.1. Innovate regulatory means
Adopt new enforcement tools such as algorithmic "insight mirror" regulatory systems and blockchain evidence preservation systems to break open the "algorithmic black box", strengthen both ex ante and ex post regulation, support the modernization of the market supervision system with information technology, and continuously innovate regulatory means to improve the efficiency and quality of regulation.
5.2.1.1. Penetrating supervision
Strengthen operators' information disclosure obligations and establish an algorithm filing system: platforms should submit technical documentation of the pricing algorithm and certified assessment reports before the algorithm enters market application, and deploy it only after review, so as to reduce big data "price discrimination" at the source. To balance transparency requirements with trade secret protection, a principle of limited transparency should be applied: core parameters are reported to the regulatory and auditing authorities, with the source code open to their review, while the public is given a "simplified explanation" of algorithmic decisions that sets out the basic principles of the algorithm, establishing a standard of "understandable transparency" so that users can grasp the pricing logic. Consumers should be informed substantively rather than merely formally: the price display page should indicate whether the price is generated by a personalized algorithm, and when consumers question a price, the platform should provide an explanatory report within a set period.
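As an illustration of what a "limited transparency" filing record might separate out, the sketch below defines a hypothetical data structure in which core parameters are visible only to regulators and auditors while the public sees only a simplified explanation and a personalized-pricing flag. All field names are assumptions made for this example; no actual filing schema is implied.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class AlgorithmFiling:
    """Hypothetical algorithm filing record under a 'limited transparency' scheme."""
    platform_name: str
    algorithm_version: str
    # Disclosed only to regulators and auditors (core parameters, assessment report).
    core_pricing_factors: List[str] = field(default_factory=list)
    assessment_report_id: str = ""
    # Shown to the public: a plain-language summary of the pricing logic.
    public_explanation: str = ""
    # Shown on the price display page.
    uses_personalized_pricing: bool = False

filing = AlgorithmFiling(
    platform_name="ExamplePlatform",  # hypothetical platform
    algorithm_version="2025.1",
    core_pricing_factors=["demand level", "inventory", "membership tier"],
    assessment_report_id="RPT-0001",
    public_explanation="Prices may vary with demand, inventory, and promotions.",
    uses_personalized_pricing=True,
)
print(filing.public_explanation)
```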
Establish a digital profiling system for platform pricing behavior that generates risk ratings from indicators such as historical violation records, the dispersion of price differences, and user complaint rates, implement targeted supervision accordingly, and move regulatory technology from "monitoring and identification" toward "risk early warning".
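A minimal sketch, under assumed weights and cut-offs, of how such a risk rating could be computed from the three indicators just mentioned (historical violations, price-difference dispersion, and user complaint rate):

```python
def pricing_risk_score(violations: int, price_dispersion: float, complaint_rate: float) -> float:
    """
    Toy risk score in [0, 100] built from the three indicators named in the text.
    price_dispersion might be, e.g., the coefficient of variation of prices quoted
    for identical orders. The caps and weights below are assumptions for illustration.
    """
    score = 0.0
    score += min(violations, 5) / 5 * 40           # up to 40 points for past violations
    score += min(price_dispersion, 0.5) / 0.5 * 40  # up to 40 points for price dispersion
    score += min(complaint_rate, 0.02) / 0.02 * 20  # up to 20 points for complaint rate
    return round(score, 1)

def risk_tier(score: float) -> str:
    # Hypothetical cut-offs mapping the score to targeted-supervision tiers.
    if score >= 70:
        return "key supervision"
    if score >= 40:
        return "enhanced monitoring"
    return "routine monitoring"

s = pricing_risk_score(violations=2, price_dispersion=0.3, complaint_rate=0.005)
print(s, risk_tier(s))  # 45.0 enhanced monitoring
```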
In addition, enforcement agencies should enhance their regulatory capabilities and keep pace with the times. First, increase investment in data security management equipment, deploy detection devices, and establish a price-monitoring mechanism [8]; second, conduct training on relevant technical knowledge and cultivate a corps of professional enforcement personnel; third, seek technical support and certification from third-party organizations and enterprises, use technical means to regulate algorithmic technology, and improve efficiency as well as the accuracy and objectivity of test results.
5.2.1.2. Dynamic regulatory sandbox
Regulatory authorities can set up real-time monitoring databases and use "crawl review" to scrape price-related data from websites, comparing it across dimensions such as gender and device type to screen for abnormal pricing in real time. The platform can run its pricing algorithm in a closed environment while regulators test it in real time through an API, feeding in specific test data and examining the outputs to probe the pricing algorithm and review it for model bias. A "price transparency chain" built on blockchain technology can record the platform's pricing data and ensure that transactions are traceable and tamper-resistant.
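To illustrate the sandbox idea, the sketch below stands in for the platform's sandboxed pricing API with a deliberately biased toy function, queries it with identical product requests under varied user attributes, and flags the product when the relative price spread exceeds an assumed 5% threshold. The endpoint, the attributes tested, and the threshold are all hypothetical.

```python
import itertools

def sandbox_price_api(product_id: str, user_attrs: dict) -> float:
    """Stand-in for the platform's pricing API inside the sandbox.
    In a real test this would be an HTTP call to the sandboxed endpoint."""
    base = 100.0
    # Deliberately biased toy logic so the probe below has something to detect.
    if user_attrs.get("device") == "iOS":
        base *= 1.08
    if user_attrs.get("member"):
        base *= 1.05
    return round(base, 2)

def probe(product_id: str, threshold: float = 0.05) -> bool:
    """Query the same product under controlled attribute combinations and flag it
    if the relative price spread exceeds the (assumed) threshold."""
    devices = ["iOS", "Android"]
    membership = [True, False]
    prices = [
        sandbox_price_api(product_id, {"device": d, "member": m})
        for d, m in itertools.product(devices, membership)
    ]
    spread = (max(prices) - min(prices)) / min(prices)
    return spread > threshold

print(probe("hotel-123"))  # True: the toy pricing logic discriminates by device and membership
```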
5.2.1.3. Graded and categorized regulatory mechanisms
Differentiated regulation should be applied according to platform size and risk level. Mandatory algorithm filing should be required of platforms with more than 500 million annual active users or a market share above 30%, with changes to pricing models subject to reporting. Platforms with 100 million to 500 million annual active users should be subject to sample monitoring and required to submit quarterly self-inspection reports; platforms with fewer than 100 million annual active users should be subject to complaint-triggered oversight.
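A minimal sketch mapping the thresholds above to supervision tiers; the tier labels simply restate the text and the function itself is illustrative only.

```python
def regulatory_tier(annual_active_users: int, market_share: float) -> str:
    """Map a platform to a supervision tier using the thresholds described above.
    annual_active_users is a head count; market_share is a fraction (0.30 = 30%)."""
    if annual_active_users > 500_000_000 or market_share > 0.30:
        return "mandatory algorithm filing; pricing-model changes must be reported"
    if annual_active_users >= 100_000_000:
        return "sample monitoring; quarterly self-inspection reports"
    return "complaint-triggered oversight"

print(regulatory_tier(600_000_000, 0.10))  # top tier by user count
print(regulatory_tier(250_000_000, 0.05))  # middle tier
print(regulatory_tier(20_000_000, 0.01))   # lowest tier
```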
Regulation of big data price discrimination should also follow a graded, three-tier approach, and enforcement agencies should not rely on fines as the sole instrument of governance. Higher fines should be set for malicious "price discrimination" by dominant platforms so that the penalty genuinely deters. For occasional violations by medium-sized platforms, flexible measures such as administrative guidance and admonitory talks should take priority, with operators required to self-examine and rectify rather than face direct administrative penalties. In judicial practice, one e-commerce platform, after mediation, promised to fully rectify its pricing algorithm, to treat all users equally going forward, and to display clearly the composition of commodity prices and the reasons for fluctuations.
5.2.1.4. Cross-domain judicial collaboration
Establish a "data passport" mechanism to share resources and information across regions, further enhance the capacity to collect and analyze data, and detect potential anti-competitive behavior to the greatest extent possible. Big data "price gouging" should be regulated in a unified way by the market supervision and administration departments, which would collect, organize, and analyze cases and then route them to the competent bodies, maximizing resource savings, improving the efficiency of administrative enforcement, and resolving the enforcement problems caused by overlapping functions.
5.2.2. Multi-subject collaborative governance mechanism
To effectively protect the legitimate rights and interests of consumers in the era of artificial intelligence, algorithmic power must be governed jointly, through self-regulation by operators, guidance by industry associations, supervision by public authorities, and self-protection by consumers, and this must become the norm.
The Cyberspace Administration should take the lead in launching special campaigns on algorithmic governance and establish a long-term mechanism for inter-departmental coordination, so that each department supervises and manages algorithmic recommendation services within its own remit and all departments jointly carry out algorithmic security governance. The state should promote information-sharing platforms and issue trusted-certification marks to algorithms that pass fairness tests, fostering healthy competition within the industry. Working with industry, academia, and industry organizations, regulators should conduct special investigations, draw on international experience, issue regulatory guidelines for typical conduct such as big data price discrimination, guide industry expectations, and establish integrity incentives and a blacklist system for big data price discrimination that forces rule-breaking operators to correct their behavior. A company whose big data "price discrimination" reaches the highest blacklist level should be required to suspend business for rectification and be removed from the blacklist only after its rectification passes review. At the same time, the platform of a blacklisted enterprise should alert customers that the enterprise has engaged in big data "price discrimination", providing "double insurance" for the protection of consumers' rights and interests.
Industry associations should strengthen supervision over the pricing behavior of market entities within their industries, restrain operators' pricing effectively through measures such as setting industry standards for price fluctuations, standardizing the operating rules of big data analytics, and establishing corresponding credit-based punishment systems, and thereby create a lawful and orderly business environment. Industry associations can use questionnaires to survey the state of the industry, put forward constructive suggestions, and publicly criticize enterprises that violate association rules.
Platform enterprises should establish and improve "whistleblower" systems, internalize algorithmic ethics into corporate codes of conduct, appoint algorithm supervisors to verify algorithms so that "technology checks technology", and proactively improve internal governance mechanisms, such as setting up accessible reporting channels, adopting professional screening and investigation measures, promptly verifying the relevant facts, and promptly reporting back the results of investigation and handling, so that large numbers of problems can be resolved internally and more efficiently. Platforms could also set up consumer compensation funds for the quick settlement of small disputes.
5.2.3. Clarify regulatory principles
Administrative authorities should neither impose a one-size-fits-all ban on big data "price discrimination" nor over-enforce. They should adhere to "inclusive and prudent" regulation, with market self-regulation as the main approach and government regulation as a supplement; act lawfully, reasonably, and proportionately; treat normal commercial activities such as promotions correctly; and focus regulation on the areas where the gains are most significant, so as to avoid the situation in which "loosening leads to chaos and tightening stifles the market".
5.3. Judicial level
5.3.1. Clarify the reversal of the burden of proof
The plaintiff need only provide preliminary evidence (such as screenshots comparing prices across different accounts) showing that the platform operator has engaged in big data price discrimination, that is, that the price of the same goods or services purchased by the plaintiff is higher than that offered to other consumers. The operator, which holds the bulk of the information and the key evidence, then bears the burden of proof: it must explain the algorithm and prove the reasonableness of its pricing, and it bears the adverse consequences of the lawsuit if it cannot.
5.3.2. Expansion of public interest litigation
Support people's procuratorates at all levels, consumer associations, and organizations designated by the national cyberspace administration in filing class actions on behalf of consumers, so as to narrow the gap in litigation capacity between plaintiff and defendant, reduce the cost of individual rights protection, and free individual consumers from complex rights-protection procedures. At the same time, the representative litigation rules of the civil procedure system should be applied so that the res judicata of certain "price gouging" cases extends to consumers who did not participate in the proceedings, reducing the waste of judicial resources, improving the efficiency of rights protection, strengthening consumers' willingness to litigate, and using strong deterrence to stop Internet platform operators' "price gouging".
5.3.3. Improve the standards of compensation
Introduce a "consumer surplus loss" model that ties the amount of damages directly to the price the consumer actually paid for the order, simplifying the calculation and saving litigation costs. Introduce punitive damages for small-value harms, establish a minimum compensation standard for big data price discrimination, and raise the cost of violations by increasing punitive damages or imposing penalties as a multiple of the platform merchant's actual gains, allowing consumers to claim a refund plus treble compensation and thereby giving effect to the judicial regulation of big data price discrimination.
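As a worked illustration of one possible reading of this compensation rule, the sketch below refunds the overcharge and adds treble punitive compensation on the amount paid, subject to a minimum floor for small claims. The 500-yuan floor echoes the fraud rule in Article 55 of the Consumer Rights and Interests Protection Law; whether and how it would apply to big data price discrimination is the article's proposal, not settled law, and the treble-of-amount-paid reading is an assumption.

```python
def proposed_damages(paid_price: float, benchmark_price: float, floor: float = 500.0) -> float:
    """
    Illustrative calculation only: refund of the overcharge (paid price minus
    benchmark price) plus punitive compensation of three times the amount paid,
    with a minimum floor for small claims. All figures are in yuan.
    """
    overcharge = max(paid_price - benchmark_price, 0.0)
    punitive = max(3 * paid_price, floor)
    return round(overcharge + punitive, 2)

# Example: an order priced at 120 yuan for this user against a 100-yuan benchmark.
# Overcharge 20 + punitive max(3 * 120, 500) = 500, so total 520.0.
print(proposed_damages(paid_price=120.0, benchmark_price=100.0))
```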
6. Conclusions
Social progress depends both on the pioneers of advanced technology and on the guardians of fundamental values. Big data "price discrimination" is a core issue in building a fair trading order for the digital economy. Regulating it requires moving beyond the traditional consumer protection paradigm and acting simultaneously through legislation, enforcement, and adjudication, with all actors working together to curb the expansion of algorithmic hegemony, achieve the organic unity of "technology for good" and "the market for the public", protect consumers' legitimate rights and interests and normal market order, and lay a solid legal foundation for the construction of Digital China.
References
[1]. Yu Ling, "Misinterpretation and Clarification of the Anti-Monopoly Law Attributes of Algorithmic Consumer Price Discrimination," Legal Science, No. 9, 2020, pp. 83-99.
[2]. Zhu Chengcheng, "Analysis of the Illegality of Big Data Price Discrimination and Exploration of Legal Regulation: An Analysis from the Perspective of Consumer Rights Protection," Southern Finance, No. 4, 2020, pp. 92-99.
[3]. Huang Yi and Song Ziyin, "Legal Regulation of 'Algorithmic Price Discrimination' in the Context of Big Data," Zhongzhou Academic Journal, No. 4, 2022.
[4]. Civil Judgment No. 3129 (2021) of the Intermediate People's Court of Shaoxing City, Zhejiang Province.
[5]. Liu Quan, "On the Data Reporting Obligation of Network Platforms," Contemporary Law, No. 5, 2019.
[6]. Hu Xiaohong, "The Construction of the 'Gatekeeper' Obligation in China's Platform Economy Field from the Perspective of Anti-Monopoly Law," Xuehai, No. 2, 2023.
[7]. Hu Bin, "The Theory of Competition and Cooperation of Administrative Legal Norms and the Construction of Applicable Rules: An Analysis Based on 123 Judicial Documents," Administrative Law Research, No. 2, 2022, pp. 139-152.
[8]. Pan Ding and Xie Han, "The Evolutionary Game between Government Regulation and E-commerce Enterprises' 'Price Discrimination' Behavior in the Digital Economy," Economic and Management, No. 1, 2021, pp. 77-84.