
Analogue3D Review: A Retro Gamer's Dream

This is the best way to play classic N64 games in 2025.
freeAgent · Los Angeles, CA · 2 hours ago

Is Your Android TV Streaming Box Part of a Botnet?


On the surface, the Superbox media streaming devices for sale at retailers like BestBuy and Walmart may seem like a steal: They offer unlimited access to more than 2,200 pay-per-view and streaming services like Netflix, ESPN and Hulu, all for a one-time fee of around $400. But security experts warn these TV boxes require intrusive software that forces the user’s network to relay Internet traffic for others, traffic that is often tied to cybercrime activity such as advertising fraud and account takeovers.

Superbox media streaming boxes for sale on Walmart.com.

Superbox bills itself as an affordable way for households to stream all of the television and movie content they could possibly want, without the hassle of monthly subscription fees — for a one-time payment of nearly $400.

“Tired of confusing cable bills and hidden fees?,” Superbox’s website asks in a recent blog post titled, “Cheap Cable TV for Low Income: Watch TV, No Monthly Bills.”

“Real cheap cable TV for low income solutions does exist,” the blog continues. “This guide breaks down the best alternatives to stop overpaying, from free over-the-air options to one-time purchase devices that eliminate monthly bills.”

Superbox claims that watching a stream of movies, TV shows, and sporting events won’t violate U.S. copyright law.

“SuperBox is just like any other Android TV box on the market, we can not control what software customers will use,” the company’s website maintains. “And you won’t encounter a law issue unless uploading, downloading, or broadcasting content to a large group.”

A blog post from the Superbox website.

There is nothing illegal about the sale or use of the Superbox itself, which can be used strictly as a way to stream content from providers where users already have a paid subscription. But that is not why people are shelling out $400 for these machines. The only way to watch those 2,200+ channels for free with a Superbox is to install several apps made for the device that enable users to stream this content.

Superbox’s homepage includes a prominent message stating the company does “not sell access to or preinstall any apps that bypass paywalls or provide access to unauthorized content.” The company explains that they merely provide the hardware, while customers choose which apps to install.

“We only sell the hardware device,” the notice states. “Customers must use official apps and licensed services; unauthorized use may violate copyright law.”

Superbox is technically correct here, except for maybe the part about how customers must use official apps and licensed services: Before the Superbox can stream those thousands of channels, users must configure the device to update itself, and the first step involves ripping out Google’s official Play store and replacing it with something called the “App Store” or “Blue TV Store.”

Superbox does this because the device does not use the official Google-certified Android TV system, and its apps will not load otherwise. Only after the Google Play store has been supplanted by this unofficial App Store do the various movie and video streaming apps built specifically for the Superbox become available for download (again, outside of Google’s app ecosystem).
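One rough way to tell whether a box you already own has had the official Play store swapped out is to ask the device which packages are installed over adb. The sketch below is purely illustrative and is not drawn from the article or from Censys’s research; it assumes USB debugging is enabled and the adb tool is on your PATH, and it treats the presence of the official Play Store package (com.android.vending) plus a crude package-name heuristic as signals worth a closer look, not as a verdict.

    # Illustrative sketch: list installed packages on an Android TV box via adb
    # and check whether the official Google Play Store is among them.
    # Assumes USB debugging is enabled and `adb` is installed on the host.
    import subprocess

    def list_packages():
        out = subprocess.run(
            ["adb", "shell", "pm", "list", "packages"],
            capture_output=True, text=True, check=True,
        ).stdout
        # Each line of output looks like "package:com.example.app"
        return [line.split(":", 1)[1].strip() for line in out.splitlines() if ":" in line]

    if __name__ == "__main__":
        pkgs = list_packages()
        print("Official Play Store present:", "com.android.vending" in pkgs)
        # Crude heuristic: surface anything else that calls itself a store.
        for p in pkgs:
            if "store" in p.lower() and p != "com.android.vending":
                print("Possible third-party app market:", p)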

Experts say while these Android streaming boxes generally do what they advertise — enabling buyers to stream video content that would normally require a paid subscription — the apps that enable the streaming also ensnare the user’s Internet connection in a distributed residential proxy network that uses the devices to relay traffic from others.

Ashley is a senior solutions engineer at Censys, a cyber intelligence company that indexes Internet-connected devices, services and hosts. Ashley requested that only her first name be used in this story.

In a recent video interview, Ashley showed off several Superbox models that Censys was studying in the malware lab — including one purchased off the shelf at BestBuy.

“I’m sure a lot of people are thinking, ‘Hey, how bad could it be if it’s for sale at the big box stores?'” she said. “But the more I looked, things got weirder and weirder.”

Ashley said she found the Superbox devices immediately contacted a server at the Chinese instant messaging service Tencent QQ, as well as a residential proxy service called Grass IO.

GET GRASSED

Also known as getgrass[.]io, Grass says it is “a decentralized network that allows users to earn rewards by sharing their unused Internet bandwidth with AI labs and other companies.”

“Buyers seek unused internet bandwidth to access a more diverse range of IP addresses, which enables them to see certain websites from a retail perspective,” the Grass website explains. “By utilizing your unused internet bandwidth, they can conduct market research, or perform tasks like web scraping to train AI.” 

Reached via Twitter/X, Grass founder Andrej Radonjic told KrebsOnSecurity he’d never heard of a Superbox, and that Grass has no affiliation with the device maker.

“It looks like these boxes are distributing an unethical proxy network which people are using to try to take advantage of Grass,” Radonjic said. “The point of grass is to be an opt-in network. You download the grass app to monetize your unused bandwidth. There are tons of sketchy SDKs out there that hijack people’s bandwidth to help webscraping companies.”

Radonjic said Grass has implemented “a robust system to identify network abusers,” and that if it discovers anyone trying to misuse or circumvent its terms of service, the company takes steps to stop it and prevent those users from earning points or rewards.

Superbox’s parent company, Super Media Technology Company Ltd., lists its street address as a UPS store in Fountain Valley, Calif. The company did not respond to multiple inquiries.

According to this teardown by behindmlm.com, a blog that covers multi-level marketing (MLM) schemes, Grass’s compensation plan is built around “grass points,” which are earned through the use of the Grass app and through app usage by recruited affiliates. Affiliates can earn 5,000 grass points for clocking 100 hours usage of Grass’s app, but they must progress through ten affiliate tiers or ranks before they can redeem their grass points (presumably for some type of cryptocurrency). The 10th or “Titan” tier requires affiliates to accumulate a whopping 50 million grass points, or recruit at least 221 more affiliates.

Radonjic said Grass’s system has changed in recent months, and confirmed the company has a referral program where users can earn Grass Uptime Points by contributing their own bandwidth and/or by inviting other users to participate.

“Users are not required to participate in the referral program to earn Grass Uptime Points or to receive Grass Tokens,” Radonjic said. “Grass is in the process of phasing out the referral program and has introduced an updated Grass Points model.”

A review of the Terms and Conditions page for getgrass[.]io at the Wayback Machine shows Grass’s parent company has changed names at least five times in the course of its two-year existence. In June 2023, Grass was owned by a company called Wynd Network. By March 2024, the owner was listed as Lower Tribeca Corp. in the Bahamas. By August 2024, Grass was controlled by Half Space Labs Limited, and in November 2024 the company was owned by Grass OpCo (BVI) Ltd. Currently, the Grass website says its parent is just Grass OpCo Ltd (no BVI in the name).

Radonjic acknowledged that Grass has undergone “a handful of corporate clean-ups over the last couple of years,” but described them as administrative changes that had no operational impact. “These reflect normal early-stage restructuring as the project moved from initial development…into the current structure under the Grass Foundation,” he said.

UNBOXING

Censys’s Ashley said the phone home to China’s Tencent QQ instant messaging service was the first red flag with the Superbox devices she examined. She also discovered the streaming boxes included powerful network analysis and remote access tools, such as Tcpdump and Netcat.

“This thing DNS hijacked my router, did ARP poisoning to the point where things fall off the network so they can assume that IP, and attempted to bypass controls,” she said. “I have root on all of them now, and they actually have a folder called ‘secondstage.’ These devices also have Netcat and Tcpdump on them, and yet they are supposed to be streaming devices.”
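The ARP poisoning Ashley describes often leaves a visible symptom on the local network: a single MAC address answering for more than one IP address. The sketch below is an illustrative check of that symptom, not a tool from Censys or from this article; it assumes a Linux or macOS machine on the same network, where `arp -a` prints entries like “? (192.168.1.1) at aa:bb:cc:dd:ee:ff”, and a duplicate should be read as a prompt to investigate rather than proof of spoofing.

    # Illustrative sketch: flag MAC addresses in the local ARP cache that claim
    # more than one IP address, a common symptom of ARP spoofing.
    # Assumes a Linux/macOS host where `arp -a` is available.
    import re
    import subprocess
    from collections import defaultdict

    ARP_LINE = re.compile(
        r"\((?P<ip>\d+\.\d+\.\d+\.\d+)\) at "
        r"(?P<mac>(?:[0-9a-fA-F]{1,2}:){5}[0-9a-fA-F]{1,2})"
    )

    def arp_table():
        out = subprocess.run(["arp", "-a"], capture_output=True, text=True, check=True).stdout
        by_mac = defaultdict(list)
        for m in ARP_LINE.finditer(out):
            by_mac[m["mac"].lower()].append(m["ip"])
        return by_mac

    if __name__ == "__main__":
        for mac, ips in arp_table().items():
            if len(ips) > 1:
                # Routers and multi-homed hosts can legitimately own several IPs,
                # so treat this as a lead to chase down, not a verdict.
                print(f"{mac} is answering for multiple IPs: {', '.join(ips)}")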

A quick online search shows various Superbox models and many similar Android streaming devices for sale at a wide range of top retail destinations, including Amazon, BestBuy, Newegg, and Walmart. Newegg.com, for example, currently lists more than three dozen Superbox models. In all cases, the products are sold by third-party merchants on these platforms, but in many instances the fulfillment comes from the e-commerce platform itself.

“Newegg is pretty bad now with these devices,” Ashley said. “Ebay is the funniest, because they have Superbox in Spanish — the SuperCaja — which is very popular.”

Superbox devices for sale via Newegg.com.

Ashley said Amazon recently cracked down on Android streaming devices branded as Superbox, but that those listings can still be found under the more generic title “modem and router combo” (which may be slightly closer to the truth about the device’s behavior).

Superbox doesn’t advertise its products in the conventional sense. Rather, it seems to rely on lesser-known influencers on places like YouTube and TikTok to promote the devices. Meanwhile, Ashley said, Superbox pays those influencers 50 percent of the value of each device they sell.

“It’s weird to me because influencer marketing usually caps compensation at 15 percent, and it means they don’t care about the money,” she said. “This is about building their network.”

A TikTok influencer casually mentions and promotes Superbox while chatting with her followers over a glass of wine.

BADBOX

As plentiful as the Superbox is on e-commerce sites, it is just one brand in an ocean of no-name Android-based TV boxes available to consumers. While these devices generally do provide buyers with “free” streaming content, they also tend to include factory-installed malware or require the installation of third-party apps that engage the user’s Internet address in advertising fraud.

In July 2025, Google filed a “John Doe” lawsuit (PDF) against 25 unidentified defendants dubbed the “BadBox 2.0 Enterprise,” which Google described as a botnet of over ten million Android streaming devices that engaged in advertising fraud. Google said the BADBOX 2.0 botnet, in addition to compromising multiple types of devices prior to purchase, can also infect devices by requiring the download of malicious apps from unofficial marketplaces.

Some of the unofficial Android devices flagged by Google as part of the Badbox 2.0 botnet are still widely for sale at major e-commerce vendors. Image: Google.

Several of the Android streaming devices flagged in Google’s lawsuit are still for sale on top U.S. retail sites. For example, searching for the “X88Pro 10” and the “T95” Android streaming boxes finds both continue to be peddled by Amazon sellers.

Google’s lawsuit came on the heels of a June 2025 advisory from the Federal Bureau of Investigation (FBI), which warned that cyber criminals were gaining unauthorized access to home networks by either configuring the products with malicious software prior to the user’s purchase, or infecting the device as it downloads required applications that contain backdoors, usually during the set-up process.

“Once these compromised IoT devices are connected to home networks, the infected devices are susceptible to becoming part of the BADBOX 2.0 botnet and residential proxy services known to be used for malicious activity,” the FBI said.

The FBI said BADBOX 2.0 was discovered after the original BADBOX campaign was disrupted in 2024. The original BADBOX was identified in 2023, and primarily consisted of Android operating system devices that were compromised with backdoor malware prior to purchase.

Riley Kilmer is founder of Spur, a company that tracks residential proxy networks. Kilmer said Badbox 2.0 was used as a distribution platform for IPidea, a China-based entity that is now the world’s largest residential proxy network.

Kilmer and others say IPidea is merely a rebrand of 911S5 Proxy, a China-based proxy provider sanctioned last year by the U.S. Department of the Treasury for operating a botnet that helped criminals steal billions of dollars from financial institutions, credit card issuers, and federal lending programs (the U.S. Department of Justice also arrested the alleged owner of 911S5).

How are most IPidea customers using the proxy service? According to the proxy detection service Synthient, six of the top ten destinations for IPidea proxies involved traffic that has been linked to either ad fraud or credential stuffing (account takeover attempts).

Kilmer said companies like Grass are probably being truthful when they say some of their customers are firms performing web scraping to train artificial intelligence models, because a great deal of the content scraping that ultimately benefits AI companies now relies on these proxy networks to further obfuscate aggressive data-slurping activity. By routing this unwelcome traffic through residential IP addresses, Kilmer said, content scraping firms can make it far trickier to filter out.

“Web crawling and scraping has always been a thing, but AI made it like a commodity, data that had to be collected,” Kilmer told KrebsOnSecurity. “Everybody wanted to monetize their own data pots, and how they monetize that is different across the board.”

SOME FRIENDLY ADVICE

Products like Superbox are drawing increased interest from consumers as more popular network television shows and sportscasts migrate to subscription streaming services, and as people begin to realize they’re spending as much or more on streaming services than they previously paid for cable or satellite TV.

These streaming devices from no-name technology vendors are another example of the maxim, “If something is free, you are the product,” meaning the company is making money by selling access to and/or information about its users and their data.

Superbox owners might counter, “Free? I paid $400 for that device!” But remember: Just because you paid a lot for something doesn’t mean you are done paying for it, or that somehow you are the only one who might be worse off from the transaction.

It may be that many Superbox customers don’t care if someone uses their Internet connection to tunnel traffic for ad fraud and account takeovers; for them, it beats paying for multiple streaming services each month. My guess, however, is that quite a few people who buy (or are gifted) these products have little understanding of the bargain they’re making when they plug them into an Internet router.

Superbox performs some serious linguistic gymnastics to claim its products don’t violate copyright laws, and that its customers alone are responsible for understanding and observing any local laws on the matter. However, buyer beware: If you’re a resident of the United States, you should know that using these devices for unauthorized streaming violates the Digital Millennium Copyright Act (DMCA), and can result in legal action, fines, and warnings or even suspension of service from your Internet service provider.

According to the FBI, there are several signs to look for that may indicate a streaming device you own is malicious, including:

- The presence of suspicious marketplaces where apps are downloaded.
- Requiring Google Play Protect settings to be disabled.
- Generic TV streaming devices advertised as unlocked or capable of accessing free content.
- IoT devices advertised from unrecognizable brands.
- Android devices that are not Play Protect certified.
- Unexplained or suspicious Internet traffic.

This explainer from the Electronic Frontier Foundation delves a bit deeper into each of the potential symptoms listed above.
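For the last item on that list, unexplained or suspicious Internet traffic, one illustrative way to get a first look is to capture a short burst of the streaming box’s outbound packets and tally where they go. The sketch below is not part of the FBI or EFF guidance; it assumes a Linux machine positioned to see the box’s traffic (for example, the router itself or a mirrored switch port), tcpdump installed, root privileges, and placeholder values for the network interface and the box’s IP address.

    # Illustrative sketch: capture a fixed number of outbound TCP packets from a
    # streaming box and count the destination IP addresses they go to.
    # Requires root and tcpdump; IFACE and BOX_IP are placeholders to change.
    import re
    import subprocess
    from collections import Counter

    IFACE = "eth0"            # placeholder: the interface that sees the box's traffic
    BOX_IP = "192.168.1.50"   # placeholder: the streaming box's IP address
    PACKETS = 500             # capture stops after this many packets

    DEST = re.compile(r"> (\d+\.\d+\.\d+\.\d+)\.\d+:")

    def capture_destinations():
        out = subprocess.run(
            ["tcpdump", "-n", "-i", IFACE, "-c", str(PACKETS),
             f"src host {BOX_IP} and tcp"],
            capture_output=True, text=True, check=True,
        ).stdout
        return Counter(m.group(1) for m in DEST.finditer(out))

    if __name__ == "__main__":
        for dest, count in capture_destinations().most_common(20):
            print(f"{count:5d} packets -> {dest}")

Destinations you do not recognize, especially high-volume ones, are worth looking up before deciding whether the device stays on your network.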

freeAgent · Los Angeles, CA · 2 hours ago: “It's crazy that major, big box retailers sell this stuff.”

CPSC warns Rad Power Bikes owners to stop using select batteries immediately due to fire risk


In an unprecedented move, the US Consumer Product Safety Commission (CPSC) has issued a public safety warning urging owners of certain Rad Power Bikes e-bike batteries to immediately stop using them, citing a risk of fire, explosion, and potentially serious injury or death.

freeAgent · Los Angeles, CA · 3 hours ago: “Rad Power is probably going away very soon.”

America’s Polarization Has Become the World's Side Hustle


A new feature on X is making people suddenly realize that a large portion of the divisive, hateful, and spammy content designed to inflame tensions, or at the very least to rack up engagement on social media, is being published by accounts that pretend to be based in the United States but are actually run by people in countries like Bangladesh, Vietnam, India, Cambodia, and Russia. An account called “Ivanka News” is based in Nigeria, “RedPilledNurse” is from Europe, “MAGA Nadine” is in Morocco, “Native American Soul” is in Bangladesh, and “Barron Trump News” is based in Macedonia, among many, many others.

Inauthentic viral accounts on X are just the tip of the iceberg, though, as we have reported. A huge amount of the viral content about American politics and American news on social media comes from sock puppet and bot accounts monetized by people in other countries. The rise of free, easy-to-use generative AI tools has supercharged this effort, and the social media monetization programs that incentivize it are almost entirely to blame. The disinformation and slop phenomenon on the internet today makes the days of ‘Russian bot farms’ and ‘fake news pages from Cyprus’ seem quaint; the problem is now fully decentralized, distributed across the world, and almost entirely funded by social media companies themselves.

This will not be news to people who have been following 404 Media, because I have done multiple investigations into the perverse incentives that social media and AI companies have created, which push people to fill their platforms with slop. But what has happened on X is the same thing that has happened on Facebook, Instagram, YouTube, and other social media platforms (it is also happening to the internet as a whole, with AI slop websites laden with plagiarized content and SEO spam and monetized with Google ads). Each social media platform has either an ad revenue sharing program, a “creator bonus” program, or a monetization program that directly pays creators who go viral on its platform.

This has created an ecosystem of side hustlers trying to gain access to these programs, and of YouTube and Instagram creators teaching people how to do so. It is easy to find these guide videos if you search for things like “monetized X account” on YouTube. Translating that phrase and searching in other languages (such as Hindi, Portuguese, Vietnamese, etc.) will bring up guides in those languages. Within seconds, I was able to find a handful of YouTubers explaining in Hindi how to create monetized X accounts; other videos on the creators’ pages explain how to fill those accounts with AI-generated content. These guides also exist in English, and it is increasingly popular to sell guides to making “AI influencers,” AI newsletters, Reels accounts, and TikTok accounts, regardless of the country you’re from.

Examples include “AK Educate” (one of thousands), which posts every few days about how to monetize accounts on Facebook, YouTube, X, Instagram, TikTok, Etsy, and others. “How to create Twitter X Account for Monitization [sic] | Earn From Twitter in Pakistan” is the name of a typical video in this genre. These channels are not just teaching people how to make and spam content, however. They are teaching people specifically how to make it seem like they are located in the United States, and how to create content that they believe will perform with American audiences on American social media. Sometimes they advise using VPNs and other tactics to make it seem like the account is posting from the United States, but many of them explain that this step doesn’t actually matter.

Americans are being targeted because advertisers pay higher ad rates to reach American internet users, who are among the wealthiest in the world. In turn, social media companies pay more money if the people engaging with the content are American. This has created a system where it makes financial sense for people from the entire world to specifically target Americans with highly engaging, divisive content. It pays more. 

For the most part, the only ‘psyop’ here is one being run on social media users by social media companies themselves in search of getting more ad revenue by any means necessary. 

For example: AK Educate has a video called “7 USA Faceless Channel Ideas for 2025,” and another video called “USA YouTube Channel Kaise Banaye [how to].” The first of these videos is in Hindi but has English subtitles.

“Where you get $1 on 1,000 views on Pakistani content,” the video begins, “you get $5 to $7 on 1,000 views on USA content.” 

“As cricket is seen in Pakistan and India, boxing and MMA are widely seen in America,” he says. Channel ideas include “MMA,” “Who Died Today USA,” “How ships sink,” news from wars, motivational videos, and Reddit story voiceovers. To show you how pervasive this advice to make channels that target Americans is, look at this, which is a YouTube search for “USA Channel Kaise Banaye”:

Screengrabs from YouTube videos about how to target Americans.

One of these videos, called “7 Secret USA-Based Faceless Channel Ideas for 2026 (High RPM Niches!)” starts with an explanation of “USA currency,” which details what a dollar is and what a cent is, and its value relative to the rupee, and goes on to explain how to generate English-language content about ancient history, rare cars, and tech news. Another video I watched showed, from scratch, how to create videos for a channel called “Voices of Auntie Mae,” which are supposed to be inspirational videos about Black history that are generated using a mix of ChatGPT, Google Translate, an AI voice tool called Speechma, Google’s AI image generator, CapCut, and YouTube. Another shows how to use Bing search, Google News Trends, Perplexity, and video generators to create “a USA Global News Channel Covering World Events,” which included making videos about the war in Ukraine and Chinese military parades. A video podcast about success stories included how a man made a baseball video called “baseball Tag of the year??? #mlb” in which 49 percent of viewers were in the USA: “People from the USA watch those types of videos, so my brother sitting at home in India easily takes his audience to an American audience,” one of the creators said in the video. 

I watched video after video being created by a channel called “Life in Rural Cambodia” about how to create and spam AI-generated content using only your phone. Another video, presented by an AI-generated woman speaking Hindi, explains how it is possible to copy and paste text from CNN into a Google Doc, run it through a program called “GravityWrite” to alter it slightly, have an AI voice read it, and post the resulting video to YouTube.

A huge and growing amount of the content that we see on the internet is created explicitly because these monetization programs exist. People are making content specifically for Americans. They are not always, or even usually, creating it because they are trying to inflame tensions. They are making it because they can make money from it, and because content viewed by Americans pays the most and performs the best. The guides to making this sort of thing focus entirely on how to make content quickly, easily, and using automated tools. They focus on how to steal content from news outlets, source things from other websites, and generate scripts using AI tools. They do not focus on spreading disinformation or fucking up America, they focus on “making money.”  This is a problem that AI has drastically exacerbated, but it is a problem that has wholly been created by social media platforms themselves, and which they seem to have little or no interest in solving. 

The new feature on X that exposes this fact is notable because people are actually talking about it, but Facebook and YouTube have had similar features for years, and it has changed nothing. Clicking on any random horrific Facebook slop page, such as this one called “City USA,” which exclusively posts photos of celebrities holding birthday cakes, shows that even though it lists its address as being in New York City, the page is being run by someone in Cambodia. This page called “Military Aviation,” which lists its address as “Washington DC,” is actually based in Indonesia. This page called “Modern Guardian,” which exclusively posts positive, fake AI content about Elon Musk, lists itself as being in Los Angeles, but Facebook’s transparency tools say it is based in Cambodia.

Besides journalists and people who feel like they are going crazy looking at this stuff, there are, realistically, no social media users who are going into the “transparency” pages of viral social media accounts to learn where they are based. The problem is not a lack of transparency, because being “transparent” doesn’t actually matter. The only thing revealed by this transparency is that social media companies do not give a fuck about this.



freeAgent · Los Angeles, CA · 3 hours ago

Aliens and the End of Philosophy

freeAgent · Los Angeles, CA · 3 hours ago: “Hegelians.”

Are the Entry-Level Jobs Drying Up for Young Adults?


Several recent reports have noted that unemployment rates for recent college graduates have been on the rise, which in turn has led to speculation that perhaps the new AI tools are already leading to fewer opportunities for such workers. Alexander Cline and Barış Kaymak of the Cleveland Fed provide some useful nuance to the discussion of employment among college graduates as compared with high school graduates in “Are Young College Graduates Losing Their Edge in the Job Market” (Economic Commentary, November 24, 2025), although they tread lightly (as is appropriate) when it comes to assigning causes.

The left-hand figure shows that unemployment rates for those with only a high school degree are consistently higher than for those with a college degree. This data includes only young adults in the 22-27 age group. However, the size of the unemployment gap varies. Since the aftermath of the Great Recession around 2010, the two rates have been generally converging (except during the pandemic). At present, the gap is the lowest it has been in a half-century.

A glass-half-full type would also point out that although the pattern over the last 15 years is striking, most of the convergence between these unemployment rates has happened because the unemployment rate for those with only a high school degree spiked so violently during the Great Recession. Also, although the gap between unemployment rates for the two groups is small now, it was also quite low in the late 1970s, late 1980s, and late 1990s. And comparing unemployment rates for these two groups doesn’t take into account how the share of those attending college has risen in the last half-century.

We can dig into these labor market patterns more deeply. There is something called the “unemployment exit rate,” which shows how many of the unemployed depart from that category in a given month. Again, the patterns shown here are for the 22-27 age group. The figures use two different methods to calculate unemployment exit rates (details on the methods are in the article). Here, the key point is that young-adult college graduates were historically more likely to exit unemployment than high school graduates, but around 2019 this pattern flip-flopped. Since then, unemployed young adult high school graduates have become more likely to leave unemployment than the college graduates.

In theory, one could exit unemployment either by finding a job or by no longer looking for one (in which case you are counted as “out of the labor force” rather than unemployed). But the authors show that most of the shift is that high school graduates have become more likely to find jobs.
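For readers who want the mechanics, here is a toy illustration of the flow arithmetic behind those exit rates. The numbers are made up, and the Cleveland Fed authors work from CPS microdata with two more careful methods; this only shows how an exit rate decomposes into job finding versus leaving the labor force.

    # Toy illustration with made-up counts for 22-27 year olds in one month.
    unemployed_last_month = 1_000   # unemployed at the start of the month
    found_job = 250                 # flowed from unemployed to employed
    left_labor_force = 100          # stopped looking; now out of the labor force

    exit_rate = (found_job + left_labor_force) / unemployed_last_month
    job_finding_rate = found_job / unemployed_last_month

    print(f"Unemployment exit rate: {exit_rate:.1%}")       # 35.0%
    print(f"Job-finding rate:       {job_finding_rate:.1%}")  # 25.0%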

The authors argue that labor demand in the US economy has in fact shifted: it used to favor those with higher education levels, but growth in labor demand is now education-neutral. They write:

Although the narrowing unemployment gap was noticed recently, the underlying factors that contribute to this trend have been operating for much longer. The prolonged jobless recovery after 2008 particularly affected high school graduates, obscuring the secular convergence of job-finding rates between college-educated and high-school-educated workers. It does not appear that recent developments are attributable to postpandemic factors alone. … Developments related to AI, which may be affecting job-finding prospects in some cases, cannot explain the decades-long decline in the college job-finding rate.

It remains true that when a young adult with a college degree does find a job, it pays better than when a young adult with only a high school degree does so. One might say that young adult college graduates used to benefit from both a lower unemployment rate and higher wages; the advantage of a lower unemployment rate has been declining, so now their main labor-market advantage is higher wages after finding a job.

The post Are the Entry-Level Jobs Drying Up for Young Adults? first appeared on Conversable Economist.

freeAgent · Los Angeles, CA · 3 hours ago