Own Content? Then Own Up Responsibility

Own Content? Then Own Up Responsibility is a news report published by Moneycontrol on 28 August 2019, written by Sunny Sen. The article examines government proposals to hold social media platforms legally liable for content they create or curate, effectively narrowing the intermediary safe harbour protection that has shielded platforms from liability for user-generated content.

Contents

  1. Article Details
  2. Full Text
  3. Context and Background
  4. External Link

Article Details

📰 Published in:
Moneycontrol
✍️ Author:
Sunny Sen
📅 Date:
28 August 2019
📄 Type:
News report
📰 Newspaper Link:
Read Online

Full Text

The Centre plans tighter rules for social media brands like Facebook, TikTok and ShareChat


The government is planning to impose higher levels of accountability on social media platforms as it grapples with the problem of bringing about order in a fast-growing industry where regulations are still nebulous.

One important measure it is considering is to tell social media brands such as TikTok, Facebook and ShareChat that they will be legally liable for content that they have had a hand in either creating or curating. This means that even if there is the slightest fingerprint of a social media company on a piece of content, platforms cannot claim to be mere intermediaries and disclaim responsibility for consequences.

"Social media companies can't bring out original content or they should take responsibility for them," said a senior government source, explaining the Centre's thinking on the issue.

The explosion in user-generated content, especially short videos, has become a regulatory headache for the authorities. When user-generated social media content crosses the bounds of decency, spreads hate or propagates fake news, intermediary status confers legal immunity because the platforms can claim they do not know what the user is putting up unless an individual or software raises a red flag.

While in the case of traditional media such as newspapers and television there is editorial control over what is printed or goes on air, social media is still a free-for-all world. Social media companies have so far argued that they are only intermediaries, and users generate content over which they have no control. But in practice it is not all that clear-cut.

"Safe harbour is for non-curated content," said Subho Ray, president of Internet and Mobile Association of India (IAMAI). "Safe harbour is not applicable to platform, but to the piece of content. If content is curated by a company they can't claim safe harbour because if you are curating it or have exclusive rights over it, you have seen it."

The government is also considering stopping intermediaries from holding exclusive rights over user-generated content on their platforms. "Discussions are on, but there is no decision on that yet," said another source.

Sunil Abraham, Executive Director of the Bangalore-based research organisation Centre for Internet and Society, said, "An intermediary is providing a two-sided market. If they participate in that market there could be competition harms."

For context, TikTok, owned by Chinese internet conglomerate ByteDance, sent a notice to ShareChat to take down content for which the former had signed exclusive rights. ShareChat took it off, but also sent a letter on August 23 to Ajay Sawhney, Secretary of the Ministry of Electronics and Information Technology (MEITY), a copy of which is with Moneycontrol, asking for clarity on laws governing intermediaries.

"Instead of acting as intermediaries (that are protected by safe harbour liability exemptions), such exclusivity deals result in these platforms being considered broadcasters or streaming services (and therefore directly liable for the nature of the content distributed by them)," Berges Y Malu, Head of Public Policy and Policy Communications at Mohalla Technology Pvt (owners of ShareChat) wrote in the letter.

TikTok engages with users who can promote the platform and teach other users how to use it. It also encourages and incentivises content creation by some of these users, but does not exercise any editorial control over content creation.

"TikTok may enter into mutual contractual agreement with some creators, where TikTok may enjoy certain exclusivity rights over the content of these creators," said a TikTok spokesperson commenting on ShareChat sending a letter to the government. "In this regard, TikTok has undertaken legal action as part of its commitment to protect its users from copyright infringement."

But there is a catch. "They can claim all rights. Because the user had granted such a liberal licence. But the user as the copyright holder can licence it again and again to multiple parties because these licences are non-exclusive," said Abraham.


Context and Background

The report appeared in August 2019, as short-video platforms like TikTok and ShareChat were experiencing rapid growth in India, particularly among users in smaller cities and towns. This expansion brought increasing scrutiny from regulators concerned about hate speech, misinformation and content that violated Indian law, yet the platforms struggled to fit neatly into existing legal categories designed for either traditional publishers or passive technical intermediaries.

The dispute between TikTok and ShareChat over exclusive content deals illustrated the blurring boundaries between platform and publisher. TikTok’s practice of signing exclusivity agreements with popular creators raised questions about whether such arrangements transformed the platform from a neutral host into an active participant in the content market, thereby forfeiting safe harbour protections under the Information Technology Act.

Sunil Abraham’s observation about competition harms pointed to a separate concern: that exclusive deals could distort the two-sided market structure platforms claim to facilitate, concentrating power in ways that harm both creators and competing platforms. His analysis of copyright licensing further highlighted the legal ambiguity, noting that non-exclusive licences allow creators to distribute the same content across multiple platforms regardless of contractual claims.

The government’s proposal to condition safe harbour protection on whether platforms create or curate content reflected broader international debates about platform liability. However, critics worried that vague standards around what constitutes curation could prompt over-moderation or give authorities excessive leverage over platforms, particularly given the subjective nature of determining when a platform’s “fingerprint” appears on user content.

📄 This page was created on 21 December 2025. You can view its history on GitHub.