A six-month investigation led by Pablo Torre Finds Out and Hunterbrook Media has uncovered troubling connections between a high-profile brainwave tech company and the Chinese Communist Party.
The company, BrainCo, was originally launched by Harvard and MIT scientists. Today, it operates out of China and has received funding from CCP-linked sources for nearly a decade.
Its signature product, the FocusCalm headset, is used by elite athletes like tennis star Jannik Sinner and Olympic skier Mikaela Shiffrin to boost performance by monitoring brainwave activity. The technology is marketed as a cognitive enhancer. But behind the marketing is a deeper geopolitical concern.
Despite its American roots, BrainCo has gradually shifted its operations to China — raising serious questions about where the brain data of U.S. athletes, schoolchildren, and others may ultimately end up.
Analysts warn that the Chinese government could gain access to this data, potentially using it for surveillance, military training, or intelligence operations. In effect, a company born on U.S. soil may now be giving Beijing a strategic edge.
The company's influence has grown quietly but significantly. What began as a promising neuroscience venture may now be one of the most geopolitically sensitive tech firms in the world — one that appears to benefit China disproportionately, and perhaps exclusively.
Broader Implications in the U.S.-China Tech Rivalry
BrainCo’s technology has applications well beyond sports. It’s also used in education, robotics, health care, and reportedly even military training.
This raises concerns reminiscent of a 2019 controversy in China, where public schools ran pilot programs using similar headsets. The trials were halted after widespread backlash over student privacy.
At the heart of the issue is the brainwave data itself: deeply personal, extremely sensitive, and increasingly valuable.
Key Risks of Brainwave Monitoring
Experts are raising red flags across multiple fronts, from privacy to ethics to national security.
1. Data Privacy and Consent
- Personal Exposure: Brainwave data can reveal thoughts, emotions, and mental states. Unauthorized access could expose users in ways never before possible.
- Informed Consent: Individuals — especially students or athletes — may not fully grasp what they’re giving up. There’s a risk of silent coercion in high-pressure environments.
2. Foreign Government Access
- Third-Party Exposure: If the data is processed or stored in China, there’s a legitimate concern that it could be accessed by CCP-affiliated entities.
- Surveillance Risk: The possibility of data being used for psychological profiling or behavioral control is no longer science fiction — it’s a real policy concern.
3. Cybersecurity and Manipulation
- Hacking Threats: Brain-computer interfaces (BCIs) are vulnerable to cyberattacks. Hackers could intercept or manipulate neural data.
- Data Integrity: Altered brain data could lead to flawed diagnostics, unsafe training environments, or manipulated performance outcomes.
4. Legal and Ethical Gaps
- Lack of Regulation: Most countries have no clear laws governing brain data. Colorado passed a neurodata privacy law in 2024, but it remains an exception.
- Autonomy at Risk: Without regulation, users could be exploited — especially minors or employees in monitored workplaces.
5. Psychological and Physical Health
- Mental Strain: Knowing your brain is being monitored may cause anxiety or loss of mental freedom.
- Medical Risks: Invasive neurotech, such as implants, carries physical health risks including infection or cognitive side effects.
6. Commercial and Political Exploitation
- Behavioral Targeting: Companies could use brain data to influence consumer decisions, or worse — manipulate them.
- Workplace Surveillance: The technology could usher in a dystopian shift where employers track attention and productivity through neural monitoring.
A Strategic Threat
If China is indeed gaining access to brainwave data through firms like BrainCo, the national security implications are hard to ignore.
Analysts believe such data could be used to refine military training, behavioral modeling, or psychological warfare capabilities.
As the U.S. struggles to balance innovation with security, this case highlights the urgent need for stronger safeguards on emerging neurotechnologies — before the brain becomes just another battlefield.
