China’s new draft rules for apps: what they signal for marketing, AdTech, and DSPs in 2026
- See Qian

- Jan 13
This article is part of the series on China’s regulatory policy updates, reports, and advertising laws. If you have missed any of the earlier articles in this series, you can click the following titles to read more – Anti-Monopoly Crackdown, PIPL, Advertising Law, October Update, November Update, December Update, January Update (XiaoHongShu Bans 39 Brands), Medical Advertising, March Update, IP Location Regulations, Didi US$1.2 Billion Fine, Measures for Security Assessment of Data Export, Provisions on the Administration of Internet Pop-Up Information Push Services, Internet Protection of Minors, China's State Secrets Law (Second Revision), China Tightens Data Privacy
On January 10, 2026, China’s internet regulator (the Cyberspace Administration of China / 国家互联网信息办公室) published a new draft regulation for public comment: 《互联网应用程序个人信息收集使用规定(征求意见稿)》 (“Draft Provisions on the Collection and Use of Personal Information by Internet Applications”). Feedback is open until February 9, 2026.
If you work in marketing or AdTech, the headline is simple: China is tightening the “rules of the road” for how apps, SDKs, app stores, and device operating systems handle personal data, especially when that data is used for personalisation and commercial marketing. The practical effect is not “ads are banned”; rather, targeting and data-sharing become more permissioned, more visible, and more accountable across the whole supply chain.

Below is a deep dive into what’s in the proposal, and what brands and marketers should do next.
What the proposal is and why it matters
The draft is designed to standardise and strengthen compliance for personal information collection and use in the app ecosystem. It explicitly covers the whole chain:
Apps (including mini-programs and quick apps),
SDK operators embedded in apps,
App distribution platforms (app stores, mini-program platforms, etc.),
Smart terminals / operating systems (permission systems, OS-level prompts, and logs).
It is anchored in existing national frameworks, particularly:
China’s Personal Information Protection Law (PIPL) (in effect since 2021),
the Network Data Security Management Regulation (国务院《网络数据安全管理条例》, effective 2025),
and ongoing enforcement campaigns targeting apps and SDKs that over-collect or fail to disclose data practices.
So while it’s “new,” it’s best understood as turning broad legal principles into operational, checkable rules for the app economy.
What’s inside the draft: the “plain English” highlights
1) Stronger “what you collect and why” disclosure—down to feature level
Apps would need to publish rules in clear language and, importantly, in structured lists that spell out—for each function—the purpose, method, and data categories collected, plus what permissions are called and how often.
This is a big deal because it pushes the ecosystem away from vague, one-size-fits-all privacy policies and toward auditable disclosure.
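To make that concrete, here is a minimal sketch of what an internal, per-feature disclosure register could look like if a team wanted to keep it machine-readable. This is purely illustrative: the draft does not prescribe a schema, and every field name below is our own assumption.

```kotlin
// Hypothetical sketch only: the draft does not prescribe a schema.
// One way an app team might keep an auditable, per-feature disclosure register.
data class FeatureDisclosure(
    val feature: String,              // the user-facing function
    val purpose: String,              // why the data is collected
    val dataCategories: List<String>, // what is collected
    val permissionsCalled: List<String>,
    val callFrequency: String         // e.g. "on feature use only"
)

fun main() {
    val disclosures = listOf(
        FeatureDisclosure(
            feature = "Personalised home feed",
            purpose = "Rank content and offers for the signed-in user",
            dataCategories = listOf("browsing history", "device identifier"),
            permissionsCalled = listOf("none"),
            callFrequency = "on app open"
        )
    )
    // A real app would publish this in its user-facing privacy rules;
    // here we just print the register for review.
    disclosures.forEach { println(it) }
}
```

The point is not the format; it is that each function’s purpose, data categories, and permission calls are written down somewhere auditable.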
2) SDK transparency becomes non-optional
If an app embeds SDKs, it must list them in a structured way (name/package, version, operator, what data the SDK collects) and include a link to the SDK’s own rules.
The draft also places direct responsibilities on SDK operators: publish their own rules, avoid collecting beyond what they declare, and support opt-outs (including for personalized recommendations and marketing).
This is especially relevant for AdTech, where SDKs often sit at the center of data collection and audience building.
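Again purely as an illustration (and with a placeholder SDK name rather than any real vendor), an internal SDK register might capture the facts the draft expects apps to disclose:

```kotlin
// Hypothetical SDK register; the SDK below is a placeholder, not a real vendor.
data class SdkRecord(
    val name: String,
    val packageName: String,
    val version: String,
    val operator: String,              // the company behind the SDK
    val dataCollected: List<String>,
    val privacyRulesUrl: String        // link to the SDK's own rules
)

fun main() {
    val sdks = listOf(
        SdkRecord(
            name = "ExampleAnalytics",
            packageName = "com.example.analytics",
            version = "3.2.1",
            operator = "Example Data Co., Ltd.",
            dataCollected = listOf("device identifier", "app events"),
            privacyRulesUrl = "https://example.com/sdk-privacy"
        )
    )
    sdks.forEach { println("${it.name} ${it.version} (${it.operator}) -> ${it.privacyRulesUrl}") }
}
```

Keeping this alongside the per-feature register above makes it much easier to publish the structured lists the draft asks for.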
3) “Separate consent” for sharing data with third parties gets reinforced
The draft states that if an app provides personal information to a third party, it should obtain the user’s separate consent.
This aligns with PIPL Article 23, which also requires separate consent when providing personal information to another processor.
Commercial takeaway: Any “data export” or third-party enrichment workflow becomes harder to justify without clean consent and clear user-facing explanations.
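For teams wondering what “separate consent” could look like in practice, here is a hedged sketch: a purpose-specific consent record that gates any third-party transfer. The ConsentStore class and purpose labels are our own assumptions for illustration, not anything defined in the draft or in PIPL.

```kotlin
// Hedged sketch of gating a third-party transfer on purpose-specific consent.
// ConsentStore and the purpose labels are illustrative assumptions.
enum class ConsentPurpose { THIRD_PARTY_SHARING, PERSONALIZED_MARKETING }

class ConsentStore {
    private val granted = mutableSetOf<ConsentPurpose>()
    fun grant(purpose: ConsentPurpose) { granted.add(purpose) }
    fun withdraw(purpose: ConsentPurpose) { granted.remove(purpose) }
    fun has(purpose: ConsentPurpose): Boolean = purpose in granted
}

fun shareWithPartner(userId: String, payload: Map<String, String>, consents: ConsentStore) {
    // Only export data when a separate, purpose-specific consent is on record.
    if (!consents.has(ConsentPurpose.THIRD_PARTY_SHARING)) {
        println("Blocked export for $userId: no separate consent on record")
        return
    }
    println("Exporting ${payload.keys} for $userId to the partner endpoint")
}

fun main() {
    val consents = ConsentStore()
    shareWithPartner("user-42", mapOf("segment" to "new-parent"), consents) // blocked
    consents.grant(ConsentPurpose.THIRD_PARTY_SHARING)                      // user says yes
    shareWithPartner("user-42", mapOf("segment" to "new-parent"), consents) // allowed
}
```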
4) Personalized marketing and recommendation: clearer opt-out expectations
The draft says that where apps use automated decision-making to push information or conduct commercial marketing, they should provide an easy-to-use option to turn off personalized recommendations, and if the user turns it off the app must stop using the relevant personal information for that purpose.
This tracks the direction already in PIPL Article 24, which requires a non-targeted option or a convenient refusal mechanism for push marketing based on automated decision-making.
Commercial takeaway: “Personalized ads by default” becomes riskier—especially when consent and opt-out UX are weak.
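As a sketch of the mechanics (the names and structure below are ours, not the draft’s): the opt-out has to actually change behaviour, meaning the app stops feeding personal information into ranking and falls back to a non-targeted experience.

```kotlin
// Illustrative only: an opt-out that actually changes ranking behaviour.
data class UserPrefs(var personalizedRecommendations: Boolean = true)

fun selectFeed(prefs: UserPrefs, profileSegments: List<String>): String {
    // When the user turns personalization off, stop using their personal
    // information for ranking and fall back to a non-targeted feed.
    return if (prefs.personalizedRecommendations && profileSegments.isNotEmpty()) {
        "personalized feed ranked with segments: $profileSegments"
    } else {
        "default feed (editorial/popularity ranking, no personal data used)"
    }
}

fun main() {
    val prefs = UserPrefs()
    println(selectFeed(prefs, listOf("fitness", "travel"))) // personalized
    prefs.personalizedRecommendations = false               // user flips the in-app toggle
    println(selectFeed(prefs, listOf("fitness", "travel"))) // non-targeted fallback
}
```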
5) App stores and platforms become enforcement “gatekeepers”
Distribution platforms would have explicit obligations to:
strengthen app review,
maintain compliance records,
refuse listing if key elements are missing (e.g., rules, account deletion),
and display permissions and privacy links clearly on download pages.
They’d also need to review existing apps within a defined period after the regulation takes effect.
Commercial takeaway: non-compliant apps risk distribution friction, which can affect campaign reach and stability.
6) Operating systems and devices: more visible permissions and logging
Smart terminals/operating systems would be expected to provide more granular permission controls, visible indicators for sensitive access (mic/camera/location), and centralized logs of key permission calls and certain device-level data collection behaviors (including device identifiers and clipboard).
Commercial takeaway: stealthy or always-on collection gets harder, and “shadow data” becomes more discoverable.
What this means for marketing and AdTech in China

A) Third-party data gets harder to activate at scale
For DSPs and data providers that rely heavily on third-party segments, the combined effect of:
structured disclosure requirements,
third-party sharing consent expectations,
distribution platform oversight,
and OS-level transparency
is that data supply becomes more compliance-sensitive.
This doesn’t eliminate third-party data, but it raises the bar on:
proving the data was collected within a legitimate purpose,
showing users were properly informed,
and ensuring users can opt out of personalization/marketing where required.
Likely outcome: some “long tail” data brokers/segments become less reliable, while premium, well-governed ecosystems become more valuable.
B) Targeting strategies shift toward “permissioned” and “platform-native”
Expect continued momentum toward:
first-party data strategies (CRM, loyalty, membership),
publisher/platform first-party audiences (where the platform can support consent, controls, and auditability),
contextual and content-based targeting (less dependent on persistent identifiers),
and “cleaner” partnerships where responsibilities are contractually clear.
This is already consistent with enforcement direction—CAC has publicly called out apps/SDKs for inadequate disclosure and rule publication, signaling scrutiny on the plumbing of data collection.
C) Measurement and attribution: more pressure on transparency and minimisation
When OS and platform rules increasingly surface permission use and data practices, measurement approaches that rely on opaque identifiers or unnecessary permissions face higher scrutiny.
That typically pushes the market toward:
measurement using aggregated and privacy-safe approaches,
stronger governance of partner SDKs,
and more disciplined data retention and purpose limitation.
D) The “UX of consent” becomes a competitive lever, not a compliance footnote
Because the draft emphasizes clear disclosures, prominent notices, and opt-outs, brands should assume that:
consent prompts and privacy explanations will directly affect addressability,
“dark patterns” or bundled consent may carry higher risk,
and better-designed flows can preserve performance while reducing compliance exposure.
In other words: privacy UX becomes part of growth strategy.
What brands and marketers should do now
This is a draft, and the final text may change—but the direction is clear enough to act on immediately.
1) Map your China data supply chain (end to end)
For each major campaign path, document:
where user data originates,
which SDKs are involved,
whether any data is shared onward,
and what disclosures/consents exist at each step.
If you can’t explain it simply, it will be hard to defend later.
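One lightweight way to do this is a per-campaign data-flow register. The sketch below is an assumption about format, not a prescribed template; the campaign name, SDK, and consent entries are all placeholders.

```kotlin
// Hypothetical per-campaign data-flow register; every entry is a placeholder.
data class DataFlowStep(
    val source: String,              // where the data originates
    val sdkOrPartner: String?,       // SDK or partner handling it, if any
    val sharedOnwardTo: String?,     // downstream recipient, if any
    val disclosureOrConsent: String  // the notice/consent covering this step
)

data class CampaignDataMap(val campaign: String, val steps: List<DataFlowStep>)

fun main() {
    val map = CampaignDataMap(
        campaign = "CNY-2026-retargeting",
        steps = listOf(
            DataFlowStep("App event: add-to-cart", "ExampleAnalytics SDK", "DSP seat", "separate consent, logged 2026-01-15"),
            DataFlowStep("CRM membership tier", null, null, "membership terms, clause 4")
        )
    )
    // Flag any step that shares data onward without a recorded consent/disclosure.
    map.steps.filter { it.sharedOnwardTo != null && it.disclosureOrConsent.isBlank() }
        .forEach { println("REVIEW: ${it.source}") }
    println("${map.campaign}: ${map.steps.size} documented steps")
}
```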
2) Stress-test your targeting: “Would this work if opt-outs increase?”
Assume more users will see and use opt-outs for personalization, and that some inventory partners may restrict certain segments.
Build plan B options:
contextual packages,
higher-quality publisher PMP strategies,
and platform-native audiences where compliance controls are strongest.
3) Tighten partner expectations (DSPs, DMPs, data providers, publishers)
Move from marketing-language promises (“compliant data”) to operational proof:
what permissions are used,
what the declared purpose is,
whether separate consent is required and how it’s obtained,
what the deletion/withdrawal path looks like.
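If it helps, the same questions can be captured as a simple due-diligence record per partner. Everything below (the PartnerProof structure, the example DSP, the deletion window) is illustrative, not a statement about any real vendor or legal requirement.

```kotlin
// Hypothetical partner due-diligence record; fields mirror the questions above.
data class PartnerProof(
    val partner: String,
    val permissionsUsed: List<String>,
    val declaredPurpose: String,
    val separateConsentRequired: Boolean,
    val consentMechanism: String?,   // how consent is obtained, if required
    val deletionPath: String         // how users can delete / withdraw
)

fun isDocumented(p: PartnerProof): Boolean =
    p.declaredPurpose.isNotBlank() &&
        p.deletionPath.isNotBlank() &&
        (!p.separateConsentRequired || !p.consentMechanism.isNullOrBlank())

fun main() {
    val dsp = PartnerProof(
        partner = "Example DSP",
        permissionsUsed = listOf("none (server-side segments only)"),
        declaredPurpose = "Audience activation for the brand's own campaigns",
        separateConsentRequired = true,
        consentMechanism = "In-app separate consent prompt, logged per user",
        deletionPath = "Partner deletion API, confirmed within 15 working days"
    )
    println("${dsp.partner} fully documented: ${isDocumented(dsp)}")
}
```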
4) Reduce “unnecessary data” to protect performance
Data minimization isn’t just legal hygiene—it’s commercial risk management. If a segment depends on fragile permissions or questionable collection practices, it’s a performance liability.
Prioritize segments that are:
purpose-bound,
well-documented,
and less exposed to policy churn.
5) Turn privacy into a brand trust story
In China, trust and safety narratives can be powerful. If your brand can credibly say:
“we only use what we need,”
“you can control personalization,”
“we work with vetted partners,”
that’s not only compliant; it’s a differentiator as the ecosystem tightens.
The bottom line
China’s app privacy governance is moving toward a world where every party in the chain is accountable: apps, SDKs, app stores, and operating systems.
For AdTech and DSPs, this signals a continued shift away from “invisible” third-party data practices and toward:
permissioned data,
platform-native audiences,
better transparency and user control, especially around personalized marketing.
For brands, the winning play isn’t to wait—it’s to rebalance now: invest in first-party foundations, choose partners with strong governance, and design campaigns that stay resilient even as opt-outs and platform enforcement increase.
Contact us to find out more about the latest updates on China’s regulatory policies.