An in-depth investigation I read recently shook me. Top venture capital firm a16z's game in the AI field is far more complicated than it looks.
On the surface, a16z is spending hundreds of millions of dollars lobbying against AI regulation in the name of promoting innovation. But a closer look at its portfolio reveals a strange pattern: the firm is pushing Washington to deregulate while investing in companies that are clearly exploiting legal loopholes.

Let's start with a few cases. There's a company called Doublespeed, which a16z backed with $1 million through its Speedrun accelerator. What does this company do? It operates a phone farm: thousands of real phones used to manipulate social media, making AI-generated fake content look like it comes from real people. The founder has openly admitted they mainly target seniors to sell health supplements. Do you see how absurd that is? The founder himself said supplement ads "should be illegal," yet they keep running them.

Next, look at Cluely. a16z led its $15 million Series A in June 2025. The company's core pitch is teaching people to cheat in interviews, dating, and exams. CEO Roy Lee even recorded himself passing an Amazon interview using AI tools, then publicly turned down the offer. Cluely was later forced to delete its cheating-themed marketing, but its homepage still claims the product is "undetectable."

The most infuriating is the gambling sector. a16z has invested in Coverd, Edgar, Cheddr, Sleeper, and other gambling apps. Coverd's business logic is straightforward: let indebted people gamble to "double" their way out of credit card bills. Its CEO said openly, "We're not helping people curb spending; we're making consumption exciting." For financially vulnerable users, that logic is basically poison.

Kalshi is even more outrageous. The company says it offers "event contracts," but in practice it lets ordinary people bet heavily on elections and sports matches. That framing neatly bypasses traditional gambling regulation: no state licenses, no age restrictions, no mandatory responsible-gambling tools. In October 2025, a16z co-led a $300 million Series D for Kalshi. Just two months later it raised another $1 billion, at an $11 billion valuation.

AI companions are scarier still. Character AI, an a16z portfolio company, closed a $150 million Series A in March 2023 and quickly became popular with teenagers. The result? A 14-year-old boy died by suicide after becoming intensely dependent on a chatbot; the bot's last message to him was "Please come back to me soon." Other cases involve minors exposed to sexualized content and self-harm suggestions. The FTC only opened a formal inquiry into the sector in September 2025.

Ex-Human’s Botify AI follows the same pattern. The platform hosts over a million AI characters, many of which are underage versions of celebrities and fictional characters involved in sexually suggestive conversations. An investigation by MIT Technology Review found these bots engage in sexual content and even claim that the legal age of consent is "arbitrary."

Fintech isn't much better. When Synapse went bankrupt in 2024, between $65 million and $96 million of customer funds went missing. a16z had led the company's $33 million Series B, with partner Angela Strange joining the board. After the bankruptcy, some customers who had deposited $280,000 got back only $500.

Truemed uses AI to generate letters of medical necessity, letting people buy everything from ice baths to red-light-therapy devices with pre-tax health funds. In March 2024 the IRS warned that this practice amounts to tax fraud. Yet a16z still led a $34 million Series A in December 2025.

LendUp is a textbook case. Marketed as a "socially responsible" alternative to payday loans, it actually charged annual interest rates of 400%. The CFPB sued it repeatedly for deceiving consumers, and it was forced to shut down in 2021. a16z had been an investor since 2012.

The craziest part is that a16z isn't just investing in these companies; it is actively shaping the policy environment to make their operations easier. In August 2025, a16z launched a $100 million super PAC called "Leading The Future," closely aligned with White House AI czar David Sacks. The firm also backs multiple state-level lobbying groups opposing AI regulation.

The key point is that former a16z partners now sit in government. Sriram Krishnan was named a senior White House AI policy advisor in December 2024, just weeks after leaving his role as a general partner at a16z. David Sacks oversees AI and cryptocurrency policy at the White House. Two other a16z alumni landed in the Office of Personnel Management and the Department of Government Efficiency.

See the logic? a16z is simultaneously investing in companies that exploit legal loopholes and harm consumers, while pouring money into lobbying for deregulation. The bet is that they can set the rules before society notices the problems.

Marc Andreessen was frank about this in his 2023 "Techno-Optimist Manifesto": he listed "risk management," "tech ethics," and "social responsibility" among his declared "enemies," and argued that any deceleration of AI would cost lives, calling the resulting preventable deaths "a form of murder." That stance conveniently aligns with his firm's financial interests.

Polls show 58% of Americans support AI regulation, while only 21% think regulation goes too far. But that isn't the point; what matters is who writes the rules. Right now it's a single venture firm, one that treats "trust and safety" as the enemy and rewards failed founders with another try.

This isn't just a business decision. How powerful AI systems will become over the next decade, and how hard their mistakes will be to undo, remain uncertain. But the work of setting safety standards, liability frameworks, and deployment rules is being led by a venture capital firm with obvious conflicts of interest. The public has no seat at the negotiating table, and no money to hire lobbyists.

It’s a bit worrying.