As of 2023, over 90% of U.S. households own at least one smart TV, according to the Consumer Technology Association. These devices offer seamless integration with streaming platforms, voice assistants, and personalized viewing—benefits powered by continuous data collection. Embedded software tracks what users watch, how long they watch, and even how they navigate menus. This information fuels content recommendations and advertising, but it also raises serious questions about consent and transparency.
Concerns over digital surveillance have already triggered legal responses. In recent years, companies like Vizio have faced lawsuits and regulatory fines for unauthorized tracking of viewing habits. Now, the Texas Attorney General is escalating the issue by filing lawsuits against Samsung and Hisense. The suits allege that both manufacturers failed to obtain proper user consent before collecting and monetizing personal viewing data, marking a new phase in state-level enforcement of consumer privacy rights.
The Office of the Attorney General of Texas, under Ken Paxton, has a track record of aggressively pursuing legal actions against major technology firms. This includes previous antitrust lawsuits against Google and Meta, positioning Texas at the forefront of state-led tech scrutiny. The state's legal actions aren't isolated—they reflect a broader trend of state attorneys general taking action where federal regulators have moved more slowly.
On October 30, 2023, Attorney General Paxton announced civil lawsuits against two major smart television manufacturers: Samsung Electronics Co. and Hisense USA Corporation. These companies, which together hold a significant share of the U.S. smart TV market, are under fire for alleged consumer privacy violations. The lawsuits accuse both firms of using their smart TVs to track viewing behavior without properly informing users or obtaining explicit consent.
This legal move fits into a growing cascade of state-led efforts to confront Big Tech on privacy, antitrust, and consumer protection issues. Over the past five years, attorneys general from California to New York have launched investigations, lawsuits, and policy frameworks targeting how tech companies build surveillance and data monetization into everyday devices. Paxton’s filing is another salvo in a widening legal and political pushback against opaque data practices in connected technologies.
In his public statement released on the day of the filing, Paxton explained, “Modern televisions have evolved into sophisticated data-collection devices that can record and transmit extremely personal information about users—usually without any meaningful notice or choice.” He pointed to deceptive trade practices, violations of the Texas Deceptive Trade Practices Act (DTPA), and lack of transparency in consumer agreements as key drivers of the lawsuit. His office emphasized that the actions were not just about smart TVs, but about creating legal boundaries for how consumer data is collected in personal spaces.
What do these suits signal about the balance between innovation and surveillance? The courtroom may soon provide answers.
Texas claims that smart TV manufacturers, including Samsung and Hisense, silently tracked consumer behavior without approval. These devices, often installed in living rooms and bedrooms, allegedly gathered detailed information about users’ viewing habits using a technology called Automatic Content Recognition (ACR).
ACR scans on-screen content, whether it comes from cable, streaming platforms, or even Blu-ray discs. By matching this content against a vast reference database, the system identifies what the viewer is watching in real time. The state argues that this tracking took place without consumers being fully aware or giving express permission.
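To make the mechanism concrete, here is a minimal Python sketch of the matching step. It is an assumption-laden simplification: real ACR systems use perceptual audio and video fingerprints that tolerate noise and compression, matched against commercial databases of millions of titles; the function names, sample bytes, and database contents below are purely illustrative.

```python
import hashlib


def fingerprint(sample: bytes) -> str:
    """Reduce a captured audio/video sample to a compact identifier.
    Real ACR uses perceptual fingerprints; an exact hash is used here
    only to keep the sketch short."""
    return hashlib.sha256(sample).hexdigest()


# Hypothetical reference database mapping fingerprints to known titles.
CONTENT_DB = {
    fingerprint(b"frame data from a known broadcast"): "Example Crime Drama S01E02",
}


def identify(sample: bytes) -> str | None:
    """Return the matched title, regardless of whether the input is cable,
    a streaming app, or a disc player connected over HDMI."""
    return CONTENT_DB.get(fingerprint(sample))


print(identify(b"frame data from a known broadcast"))      # -> Example Crime Drama S01E02
print(identify(b"something the database has never seen"))  # -> None
```

A TV running ACR samples the active input every few seconds and reports any match, which is why the source of the content makes no difference to what gets logged.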
Consent, according to the Texas Attorney General’s office, was either buried in fine print or bypassed entirely during device setup. Many consumers proceed through installation steps quickly, and default settings often favor data collection. The lawsuit alleges that manufacturers designed the onboarding experience to obscure the data-sharing agreement.
Although user interfaces may reference “personalized content” or “improving product services,” the state argues this language is vague. In several instances, opting out of data collection was made difficult or was not clearly explained. This undermines the principle of informed consent, which requires clarity, transparency, and active user participation.
Beyond simple data collection, the complaint accuses Samsung and Hisense of profiting directly from the user data of Texans. That includes selling or licensing anonymized profiles to third-party advertisers. Viewing activity, combined with demographic information, creates highly valuable advertising segments.
According to the lawsuit, not only did the companies fail to explicitly disclose this exchange, but they also did not share how the data would be used across digital ecosystems. This commercial use of personal data, the state argues, violates consumer trust and undermines the right to control one’s digital footprint. For example, a viewer’s nightly routine of streaming crime thrillers or family-friendly favorites becomes part of a profile sold to advertisers, often without that viewer ever realizing it.
Texas positions these practices as deceptive, accusing manufacturers of benefitting financially while consumers remain uninformed and powerless. This legal offensive places the spotlight on how smart TV companies leverage personal data and whether they overstep legal and ethical boundaries in the process.
Smart TVs do more than stream shows; they collect high-resolution behavioral data. With automatic content recognition (ACR) technology, these devices identify not only what is being watched but when, for how long, and how often. Over time, this creates a detailed behavioral profile of the household.
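The step from individual logs to a household profile is straightforward aggregation. The Python sketch below shows one way per-event ACR records could be rolled up; the event fields, genre labels, and profile keys are illustrative assumptions, not any manufacturer's actual schema.

```python
from collections import Counter
from dataclasses import dataclass


@dataclass
class ViewingEvent:
    title: str
    genre: str
    minutes: int
    start_hour: int  # local hour when the session began


def build_profile(events: list[ViewingEvent]) -> dict:
    """Roll individual ACR events up into a household-level behavioral profile."""
    minutes_by_genre: Counter[str] = Counter()
    sessions_by_hour: Counter[int] = Counter()
    for e in events:
        minutes_by_genre[e.genre] += e.minutes
        sessions_by_hour[e.start_hour] += 1
    return {
        "total_minutes": sum(e.minutes for e in events),
        "minutes_by_genre": dict(minutes_by_genre),
        "peak_hours": [hour for hour, _ in sessions_by_hour.most_common(3)],
    }


profile = build_profile([
    ViewingEvent("Example Crime Drama", "crime", 45, 22),
    ViewingEvent("Morning Market Report", "finance", 30, 7),
    ViewingEvent("Example Crime Drama", "crime", 50, 22),
])
# {'total_minutes': 125, 'minutes_by_genre': {'crime': 95, 'finance': 30}, 'peak_hours': [22, 7]}
```

Even this toy profile already hints at a late-night crime-drama habit and a weekday-morning interest in financial news, which is exactly the kind of inference the next paragraph describes.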
This information, when analyzed, can infer sensitive insights such as religious views, political leanings, mood cycles, income level, and even household routines. For example, frequent viewing of financial news may suggest a user’s investment habits or economic concerns, while consistent programming choices during weekday afternoons might indicate someone working from home or managing health challenges.
Unlike traditional television sets, smart TVs function as two-way surveillance devices. They monitor content inputs across platforms—broadcast, streaming, consoles—and often send this data back to third-party servers for monetization. In effect, homes become data farms, and every click or pause contributes to a robust profile used beyond the living room.
This surveillance operates silently. There are no pings, alerts, or opt-ins that ensure users understand the extent of logging activity. Consumers aren’t merely watching; they're being watched.
Across Texas, residents express growing discomfort and even outrage. There's a sharp reaction to what many see as an erosion of personal agency. People expect more control over the boundaries between their private lives and corporate interests.
Ethically, Texans question the fairness of collecting intimate personal habits without transparent communication or explicit consent. Emotionally, many connect this issue to broader fears about overreach—whether it's from big tech or from institutions profiting off private behavior without scrutiny.
Texas law requires companies to obtain clear, informed, and affirmative consent before collecting or sharing consumers’ data. This means users must be told exactly what data is being collected, how it will be used, and be given a meaningful choice to agree—or not. Passive notices buried in settings menus or pre-checked boxes don’t meet this standard. Without explicit opt-in, consent becomes a fiction.
Several smart TV manufacturers named in the Texas lawsuit allegedly relied on design tactics that obscured, downplayed, or completely bypassed this legal requirement. In practice, this meant consumers often unknowingly consented simply by plugging in the TV or clicking through a quick startup guide.
Smart TVs routinely deploy automatic content recognition (ACR) systems to track viewing habits. These systems capture data not just from streaming apps, but also from over-the-air broadcasts, cable input, and even what’s played through a connected device like a gaming console. The issue isn't the tracking per se. It’s the method of soliciting—or failing to solicit—user agreement.
According to the complaint filed by the Texas Attorney General, some companies bundled data-sharing permissions deep within lengthy, non-negotiable user agreements. In other cases, ACR features were activated by default, requiring users to opt out after setup—a model Texas law explicitly rejects.
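The difference between the two consent models is easy to see in code. The Python sketch below contrasts a default-on, opt-out design of the kind the complaint describes with an affirmative opt-in; the class and field names are hypothetical simplifications, not taken from any vendor's firmware.

```python
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class OptOutAcrSettings:
    """The pattern alleged in the complaint: tracking runs unless the user
    later finds the setting and disables it."""
    acr_enabled: bool = True  # on out of the box


@dataclass
class OptInAcrConsent:
    """Affirmative consent: tracking stays off until the user takes a
    recorded, explicit action."""
    acr_enabled: bool = False
    granted_at: datetime | None = None

    def grant(self) -> None:
        self.acr_enabled = True
        self.granted_at = datetime.now(timezone.utc)

    def may_collect(self) -> bool:
        return self.acr_enabled and self.granted_at is not None


consent = OptInAcrConsent()
assert not consent.may_collect()  # nothing is collected by default
consent.grant()                   # only an explicit, timestamped action flips the switch
assert consent.may_collect()
```

Under the opt-in model, the absence of a recorded grant means no collection; under the opt-out model, collection is the silent default.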
Privacy experts describe these practices as a textbook case of “dark patterns”—user interface designs crafted to coerce or confuse. Dr. Jennifer King, Privacy and Data Policy Fellow at Stanford University, said in a recent interview that these patterns "weaponize design against the user" and "...erode informed choice in a space where clear consent is the only barrier against exploitation."
Electronic Frontier Foundation (EFF) analysts have criticized smart TV manufacturers for treating consent as a compliance checkbox rather than a fundamental user right. One EFF researcher notes: "The illusion of choice has become standard. Users weren't asked—they were nudged, hurried, and misled."
Without transparency, consent ceases to be valid. And in Texas, that legal distinction translates directly into liability.
The lawsuits filed by the Texas Attorney General’s office against smart TV giants Samsung and Hisense don’t just allege privacy violations—they outline a roadmap for both accountability and reform. Concrete outcomes are being pursued on multiple fronts, each intended to reshape how tech companies operate within state borders.
The Attorney General seeks monetary damages from both Samsung and Hisense. If the court confirms that these companies violated Texas privacy or consumer protection laws, financial sanctions will follow. These penalties serve a dual purpose: compensate for consumer harm and disincentivize future misconduct. Under Texas law, violations of the Deceptive Trade Practices–Consumer Protection Act (DTPA) can lead to civil penalties of up to $20,000 per violation.
One major goal behind the legal action is to compel changes in how smart TV manufacturers handle personal information. The state’s legal team calls for enforceable commitments to build privacy controls into hardware and software, especially around data collection and transmission. Rather than vague promises, the lawsuits aim to embed privacy protections into the design and operation of the devices themselves.
Beyond the immediate parties involved, these lawsuits are intended to set a legal precedent. If the court affirms that collecting viewing habits, device identifiers, and location data without proper consent violates state laws, other technology manufacturers may face similar challenges. This could shape the next wave of enforcement, especially in jurisdictions without comprehensive data privacy laws.
The complaints emphasize a lack of clear, upfront disclosure about data practices. By bringing these cases to court, the state seeks to force companies to reveal what data they collect, how they process it, and with whom they share it. Greater transparency will make it harder for companies to hide behind technical language or bury disclosures inside lengthy terms-of-use agreements.
The clear intent is not only to discipline past behavior but to influence future design decisions in the tech industry. Whether the court enforces these demands will shape the conversation around digital privacy far beyond Texas borders.
The lawsuit brought by Texas against leading smart TV manufacturers isn’t happening in isolation. It belongs to a growing pattern: U.S. states stepping into the ring to challenge the unchecked influence of Big Tech. While federal efforts toward comprehensive privacy regulations remain stalled in Congress, state governments are driving accountability forward through litigation and legislation.
States have increasingly become the frontline defense in technology-related legal battles. In 2020, a coalition of 48 attorneys general, led by New York, filed an antitrust lawsuit against Facebook, accusing it of using its market dominance to stifle competition and exploit user data. Then in 2021, dozens of states sued Google, charging it with monopolistic practices in its app store operations.
California has moved beyond lawsuits. With the enactment of the California Consumer Privacy Act (CCPA) and the California Privacy Rights Act (CPRA), the state created the most robust data privacy framework in the U.S., granting consumers broad control over how their information is collected and used. Colorado, Virginia, and Connecticut followed suit with their own variations, establishing a patchwork of privacy laws filling the vacuum left by federal inaction.
The Texas lawsuit signals a strategic shift: states are no longer focusing only on tech behemoths like Google or Meta. By targeting smart device manufacturers, attorneys general are widening the scope of accountability to include hardware companies profiting from user data. If judicial rulings or settlements favor the states, expect a wave of regulations focusing on data collection practices specific to smart devices, particularly TVs, speakers, and wearables.
No current federal statute exclusively governs the privacy standards for consumer IoT (Internet of Things) devices. That gap gives states significant discretion to act. And some, like New York and Illinois, are reportedly drafting bills aimed at biometric consent and “always-on” device monitoring.
The legal tug-of-war between state privacy laws and the dormant federal framework is becoming increasingly visible. Without a single, unified data privacy law at the federal level, companies are forced to navigate a patchwork of state regulations, each with different thresholds for consent, data sharing, and user rights.
This fragmentation could lead to clashes where federal courts may be asked to decide whether certain state rules overstep or conflict with national commerce regulations. However, in the absence of preemptive federal legislation, states retain considerable power to enforce consumer protection under existing consumer fraud and deception statutes.
As state prosecutors gain traction in courts and legislatures, the era of unregulated smart technology is nearing a turning point. The Texas case isn’t merely about televisions; it opens the door for deeper scrutiny across the entire digital landscape.
Smart TVs do more than stream content—they act as data collection hubs. Manufacturers embed software that tracks viewing behavior, app usage, voice commands, and even content browsed on screen. This raw data isn't stored idly. It's packaged, analyzed, and monetized through partnerships with advertisers, data brokers, and analytics firms.
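As a rough illustration of what "packaged and monetized" can look like in practice, the Python sketch below assembles the kind of event payload a tracking component might transmit. The endpoint URL, field names, and values are hypothetical assumptions, not any manufacturer's real API.

```python
import json
import urllib.request

# Hypothetical collection endpoint -- not a real service.
COLLECTOR_URL = "https://analytics.example.invalid/v1/events"

event = {
    "device_id": "hashed-device-identifier",
    "event_type": "acr_match",
    "title": "Example Crime Drama S01E02",
    "input_source": "HDMI-1",
    "watched_seconds": 1800,
    "foreground_app": "live-tv",
}

request = urllib.request.Request(
    COLLECTOR_URL,
    data=json.dumps(event).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(request)  # left commented out: on a real device this
# upload runs in the background with no visible indication to the viewer
```

The point of the sketch is scale and invisibility: a payload this small, sent every few seconds from millions of devices, is the raw material of the data economy described next.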
According to a 2020 study by Princeton University and the University of Chicago, nearly 90% of smart TVs send data to third parties, including platforms like Google, Facebook, and Netflix—even if the user never interacts with those companies directly. The data economy operating behind the screen is both vast and largely hidden from the average consumer.
Smart TVs often lure consumers with built-in apps, cloud storage options, personalized content, or voice control capabilities—features labeled as 'value adds'. These perks, however, are subsidized by user data. The trade-off is simple: enjoy seamless services and targeted recommendations in exchange for behavioral tracking.
Samsung’s privacy policy, for instance, notes that it may collect and share viewing history to ‘provide recommendations’ and ‘improve services’, a phrasing that masks how deeply integrated data monetization is in the business model. Consumers gain convenience but surrender visibility into what their TVs record and how it's sold.
This trade-off now shapes the legal and ethical debate. When TVs become two-way devices and consumers aren't fully aware of data flows, the issue shifts from service delivery to exploitation. Unlike subscription fees or upfront hardware costs, data collection isn't itemized on a receipt. That's why critics argue that in this model, consumers aren’t just users—they are the product.
Behavioral profiles built from smart TV data can influence advertising pricing, content placement, and even political microtargeting. The stakes go beyond commercial activity into the fabric of digital identity.
Major tech firms now face pressure from regulators, advocacy groups, and increasingly skeptical users to adopt ethical data practices. Investors are paying attention too. ESG (Environmental, Social, and Governance) reports increasingly assess transparency and consumer trust metrics, placing data ethics on the corporate responsibility radar.
The economic model driving smart TVs isn't going away, but it’s evolving under the weight of legal pressure, public demand, and shifting market expectations. As the lawsuits in Texas illustrate, the conversation about smart technology has moved from convenience to consequence.
In the absence of a comprehensive federal data privacy law, Texas has turned to existing state statutes to pursue major smart TV manufacturers. The Attorney General’s office invoked provisions of the Texas Deceptive Trade Practices Act (DTPA), which prohibits businesses from engaging in false, misleading, or deceptive acts. Under this law, failing to inform consumers about data collection practices—or obtaining consent in a misleading way—constitutes a violation.
However, the DTPA was enacted in 1973, well before smart devices redefined personal technology. While it offers flexibility for enforcement, it lacks specific language addressing digital surveillance, algorithmic profiling, or the real-time data tracking embedded in modern devices. Stretching the statute this way makes privacy protection reactive rather than preventive.
Unlike Europe’s General Data Protection Regulation (GDPR), which standardizes privacy rights across the continent, the U.S. has no unified framework to govern personal data collection across smart devices. A handful of states—California with its California Consumer Privacy Act (CCPA) and Virginia with the Virginia Consumer Data Protection Act—have passed their own laws. Each brings a different scope, consumer rights model, and level of corporate responsibility.
This patchwork means a consumer in Dallas may be significantly less protected than one in San Francisco. For manufacturers selling products nationwide, the uneven requirements create compliance loopholes and enable data harvesting in jurisdictions with weaker enforcement. The absence of a federal standard invites inconsistency not only in consumer protections but also in corporate accountability.
Pursuing lawsuits against multinational corporations for privacy violations is no longer a simple matter of following paper trails. The use of obfuscated software layers, cross-device identifiers, and dynamic consent interfaces creates technical barriers that even skilled state prosecutors struggle to penetrate. Smart TV operating systems can silently collect voice commands, location data, and app viewing histories, sending the information to third parties in milliseconds—well before a consumer reads a disclosure, if one exists.
Auditing these systems requires access to source code, server logs, and internal documents that manufacturers rarely volunteer. Even when subpoenaed, data delivery can be delayed, redacted, or contained in formats difficult to interpret. At the enforcement level, most states lack the technical personnel to match legal strategy with forensic digital analysis. The imbalance favors companies that can outspend, outmaneuver, and often outlast state-led proceedings.
Attorneys general from multiple states, including Texas, have jointly called for a national baseline privacy standard. The bipartisan consensus: the current patchwork is unsustainable. In 2023, a coalition of 10 attorneys general submitted recommendations to Congress for federal baseline privacy legislation.
So far, no proposed bill has passed both houses of Congress. Political divides over preemption, private right of action, and enforcement authority have left each proposal stalled. Meanwhile, consumers remain exposed to silent data collection, and attorneys general continue to stretch outdated statutes in pursuit of modern accountability.
While Texas sues major smart TV makers over privacy concerns, users shouldn't wait on legal outcomes to take action. Smart TVs operate as sophisticated data collection devices—every setting unchecked or consent box ignored potentially opens a door into home life. Consumers can make deliberate choices now to limit data exposure and assert digital boundaries.
Most consumers don't read privacy policies. Corporations rely on this. Next time a smart TV prompts you to accept a policy, click through and scan for data-sharing terms, particularly those involving third-party advertisers or content recognition services. Look for sections titled “Data Use,” “Information Sharing,” or “Consumer Choices.” Corporate language may cloak intent, but mentions of analytics, personalization, or improvement of services usually signal monetized tracking.
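For readers who save or paste the policy text, a quick keyword scan can surface the sections worth reading closely. The Python sketch below flags terms that commonly signal monetized tracking; the term list is an assumption and is not exhaustive.

```python
# Terms that commonly signal monetized tracking in privacy policies.
SIGNAL_TERMS = [
    "automatic content recognition",
    "viewing history",
    "viewing information",
    "third party",
    "third-party",
    "advertising partners",
    "data broker",
    "personalization",
    "analytics",
]


def flag_terms(policy_text: str) -> list[str]:
    """Return the signal terms that appear anywhere in the policy text."""
    lowered = policy_text.lower()
    return [term for term in SIGNAL_TERMS if term in lowered]


sample = "We may share your viewing history with advertising partners."
print(flag_terms(sample))  # -> ['viewing history', 'advertising partners']
```

A flagged term is not proof of wrongdoing, but it tells you exactly which paragraphs deserve a careful read before you click "Accept."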
How many devices in your living room are tracking you right now? Start with the one that's mounted on the wall, and claim back control one setting at a time.
Texas stepping into the legal ring with global giants like Samsung, Hisense, and other smart TV manufacturers marks more than a regional privacy dispute—it signals a shift in the balance of power. At the center of this legal action lie unresolved questions about user consent, data collection practices, and corporate transparency. These aren’t speculative issues. They point directly to the mechanics of how major tech companies extract value from consumers without meaningful disclosure.
No matter how sleek the interface or advanced the features, a smart TV doubles as a data-harvesting tool. Texans are pushing back, arguing that users didn’t knowingly agree to participate in that exchange. When the technology in your living room behaves more like an ad-tech vehicle than a home appliance, enforcement of privacy laws becomes not just relevant—but necessary.
The lawsuit’s implications stretch far beyond state borders. This case could embolden other states or countries to launch similar challenges, potentially triggering a wave of regulatory recalibration. As users grow more informed and state governments grow more assertive, tech companies face a choice: adjust voluntary privacy guidelines, or contend with laws shaped by mounting public concern.
This isn’t just a legal clash—it’s a reflection of changing norms. The assumption that consumers will tolerate hidden surveillance in exchange for convenience no longer holds. Transparency isn’t a value-add anymore; it’s an expectation. And now, governments are beginning to enforce it.
Who gets to set the rules—corporations, legislators, or users? The outcome in Texas may reshape that balance. One state made its move. Others are watching.