
NJ A1324

Requires social media companies to take certain actions concerning accounts maintained by or featuring minors.


Status

Introduced: 01/13/2026
In Committee: 01/13/2026

Introduced Session

2026-2027 Regular Session

Bill Summary

This bill requires a social media company to develop and use on its social media platform: (1) algorithms that detect suspicious patterns and flag potentially inappropriate activity, including adult interactions with minors, private messaging frequency between adults and minors, and attempts to establish inappropriate relationships with minors; and (2) a real-time monitoring system that continuously analyzes social media content and identifies potentially inappropriate activity involving minors, including an automated reporting mechanism that promptly reports identified instances to the appropriate authorities.

Additionally, a social media company is required to: (1) prohibit minors from appearing in the results of a search conducted by a person through the social media platform's search function unless the person holds an account that the minor user has previously added; (2) identify features and content that are inappropriate for an account holder who is a minor to access, and use geo-fencing to restrict the minor's access to such content or features; (3) prioritize and handle reports of inappropriate activity involving minors by directing reports to the appropriate local authorities or child protection agencies based on the account holder's location; (4) collaborate with law enforcement agencies, child protection agencies, and legal experts to ensure compliance with privacy laws and regulations; (5) send safety alerts and notifications to account holders in a specific geographic area that has an increased risk of child exploitation; and (6) conduct regular audits and assessments to evaluate the effectiveness of the implemented monitoring and reporting measures and make necessary improvements.
This bill also requires that, in the event an account holder uses a minor's name or likeness in more than 25 percent of the sponsored content for the account, the social media platform verify that the account holder is the parent or guardian of the minors portrayed in the account holder's sponsored content and is the primary account holder. Upon verification, the social media platform is required to place a public banner on the account holder's account page that clearly indicates the account holder is an adult, and the account may appear in the results of a search conducted by any account holder through the social media platform's search function.

The Division of Consumer Affairs (division) is responsible for enforcing the bill's provisions. The division is required to investigate consumer complaints alleging violations of, and to enforce, the provisions of this bill. The division is authorized to impose a civil penalty of up to $2,500 for each violation or to initiate a civil suit in Superior Court. In addition, an individual may bring an action in the Superior Court against a social media company for failure to comply with the provisions of this bill. If the individual's suit is successful, the individual is entitled to reasonable attorney fees and court costs, as well as either actual damages or $2,500, whichever is greater.

AI Summary

This bill requires social media companies to implement advanced safety measures to protect minors online, including developing algorithms to detect suspicious activity such as inappropriate adult interactions and frequent private messaging between adults and minors, and using real-time monitoring systems to identify and report potentially harmful content involving children to authorities. Social media platforms will also be restricted from showing minors in search results unless the searcher is already connected to them, and will use geo-fencing, which creates virtual boundaries, to block minors from accessing inappropriate content or features. The bill mandates that companies prioritize reports of child exploitation, collaborate with law enforcement and child protection agencies, send safety alerts to users in high-risk areas, and regularly assess their safety measures. Furthermore, if an account holder uses a minor's name or likeness in over 25% of their sponsored content, the platform must verify that the account holder is a parent or guardian and display a public banner indicating the account holder is an adult. The Division of Consumer Affairs is tasked with enforcing these provisions, with the authority to impose civil penalties of up to $2,500 per violation or pursue civil suits. Individuals can also sue social media companies for non-compliance, potentially recovering attorney fees, court costs, and either actual damages or $2,500, whichever is greater.

Committee Categories

Business and Industry

Sponsors (3)

Last Action

Introduced, Referred to Assembly Science, Innovation and Technology Committee (on 01/13/2026)
