💻 Meta Releases AI Model of the Human Brain
Meta has introduced TRIBE v2, an open model that predicts how the human brain will respond to text, sound, or video. Essentially, it's a digital perception model: you give the system a stimulus, such as a video clip or a phrase, and it predicts which areas of the cerebral cortex will be activated.
TRIBE v2 was trained on more than 1,000 hours of fMRI recordings from over 720 people. The AI can build an averaged model of the brain, even for people it hasn't "seen" during training. And if you fine-tune it on just one hour of fMRI data from a specific person, it can generate a "personal snapshot" of that person's brain activity.
🔍 According to the developers, the model can predict activity in about 20,000 points of the cortex. The correlation between the model's predictions and real brain behavior was around 0.21, which is actually not bad.
The thing is, functional MRI records brain activity indirectly, by measuring changes in blood flow. As a result, there's inevitably a lot of noise in the signal due to blood movement, patient breathing, and other factors, so the correlation between actual brain dynamics and the MRI picture is never 100%.
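To make the 0.21 figure concrete: brain encoding models like this are typically scored by computing a Pearson correlation between the predicted and measured signal at each cortical point, then averaging. Here is a minimal sketch of that metric using synthetic data (not TRIBE or real fMRI; the signal-plus-noise setup below is just an assumption chosen to mimic a weak but real correlation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: time series for a handful of cortical points
# (the real model covers ~20,000). Shape: (timepoints, voxels).
n_time, n_voxels = 300, 5
measured = rng.standard_normal((n_time, n_voxels))

# A "prediction" that tracks the measured signal plus heavy noise,
# loosely mimicking the noisy fMRI setting described above.
predicted = 0.25 * measured + rng.standard_normal((n_time, n_voxels))

def voxelwise_pearson(pred, meas):
    """Pearson r between prediction and measurement, one value per voxel."""
    pred_c = pred - pred.mean(axis=0)
    meas_c = meas - meas.mean(axis=0)
    num = (pred_c * meas_c).sum(axis=0)
    den = np.sqrt((pred_c ** 2).sum(axis=0) * (meas_c ** 2).sum(axis=0))
    return num / den

r = voxelwise_pearson(predicted, measured)
print(f"mean voxel-wise r = {r.mean():.2f}")
```

Because fMRI noise dominates the signal, even a good model lands far below r = 1.0, which is why a mean correlation around 0.21 across thousands of voxels is a meaningful result rather than a weak one.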
💡 Of course, we're still far from true "mind reading," but this is already a significant step toward better understanding how the brain processes what it sees, hears, and reads.
@hiaimediaen

