The Download: a new Christian phone network, and debugging LLMs
MIT Tech Review AI
This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.
A new US phone network for Christians aims to block porn and gender-related content
A new US-wide cell phone network marketed to Christians is set to launch next week. It blocks porn using network-level controls that can’t be turned off—even by adult account owners.
It’s also rolling out a second filter aimed at blocking material related to gender and trans issues; that filter is optional, but it’s turned on by default across all plans.
The trouble is, many websites don’t fit neatly into one category. That leaves its maverick founder with broad, subjective control over what is allowed or banned. Read the full story.
—James O’Donnell
This startup’s new mechanistic interpretability tool lets you debug LLMs
The San Francisco–based startup Goodfire has released a new tool, Silico, that lets researchers peer inside an AI model and adjust its parameters during training. It could give users more control over how this technology is built than was once thought possible.
The goal is to make building AI models less like alchemy and more like a science. Using a technique called mechanistic interpretability, Silico maps the neurons and pathways inside a model and lets developers tweak them to reduce unwanted behaviors or steer outputs.
By exposing the “knobs and dials,” Goodfire hopes to bring AI training closer to traditional software engineering. Read the full story.
—Will Douglas Heaven
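Goodfire hasn’t published Silico’s internals, but the core idea the story describes (mapping hidden activations and nudging them to steer outputs) can be illustrated generically. Below is a minimal, self-contained sketch of activation steering on a toy NumPy network; the model, the "unwanted" hidden unit, and the steering vector are all hypothetical illustrations, not Goodfire’s actual tool or API.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer "model": input -> hidden activations -> logits.
W1 = rng.normal(size=(4, 8))
W2 = rng.normal(size=(8, 3))

def forward(x, steering=None):
    h = np.tanh(x @ W1)      # hidden activations (the "neurons" being inspected)
    if steering is not None:
        h = h + steering     # nudge activations along a chosen direction
    return h @ W2            # logits

x = rng.normal(size=(1, 4))

# Suppose an interpretability pass had identified hidden unit 2 as driving
# an unwanted behavior; build a steering vector that suppresses it.
direction = np.zeros(8)
direction[2] = 1.0

baseline = forward(x)
steered = forward(x, steering=-2.0 * direction)

# The output shift comes entirely from hidden unit 2's outgoing weights,
# i.e. baseline - steered == 2.0 * W2[2].
print(baseline - steered)
```

The design point mirrors the article: rather than retraining the whole model, a developer edits one identified pathway and observes a predictable, localized change in the output.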
With mass firing, Trump deals a fresh blow to American science
This past week delivered another gut punch for science in the US. This time, the target was the National Science Foundation—a federal agency that funds major research projects to the tune of around $9 billion. On Friday, the 22 scientists overseeing those efforts were all fired.
Since 2025, the NSF has faced budget cuts, grant terminations, and mass firings, with staff numbers down sharply and many ambitious projects grinding to a halt. The result is a major shift in how American science is funded and governed. Discover what it means, and what’s next.
—Jessica Hamzelou
This article first appeared in The Checkup, MIT Technology Review’s weekly biotech newsletter. To receive it in your inbox every Thursday, and read articles like this first, sign up here.
China’s open-source bet: 10 Things That Matter in AI Right Now
Silicon Valley AI companies follow a familiar playbook: keep the models behind an API and charge for access. China’s leading AI labs are playing a different game, releasing “open-weight” models that developers can download, adapt, and run on their own hardware.
That approach went mainstream after DeepSeek open-sourced its R1 model, which matched top US systems at a fraction of the cost. It also won something subtler: goodwill with developers. A growing cohort of Chinese labs is now following the same blueprint.
As AI shifts from hype to deployment, open-source models are making the future of AI more multipolar than Silicon Valley expected. Read the full story.
—Caiwei Chen
China’s open-source bet is one of the 10 Things That Matter in AI Right Now, our list of the biggest ideas, trends, and advances in AI today. We’re unpacking one item from the list each day here in The Download, so stay tuned.
The must-reads
I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.
1 Elon Musk has admitted that xAI trained Grok on OpenAI models. “Distillation” is standard practice in AI, despite being legally dubious. (Wired $) + The White House has accused Chinese firms of theft for using distillation. (BBC) + American labs are widely assumed to use similar techniques. (TechCrunch)
2 A “de-extinction” startup wants to resurrect a long-lost antelope. Colossal Biosciences wants to bring back the bluebuck. (Axios) + The company is using genomic editing to revive the animal. (Gizmodo) + It previously claimed to have cloned red wolves. (MIT Technology Review)
3 An OpenAI model outperformed ER doctors at diagnosing patients. By analyzing health records data and information provided to physicians. (NPR) + But it still must be proven in real-world clinical trials. (Vox)
4 Scientists are trying to power AI data centers with tiny nuclear reactors. They could provide a new way to meet AI’s energy demands. (Gizmodo) + We did the math on AI’s energy footprint. (MIT Technology Review)
5 Spotify has started verifying human artists. A new badge will distinguish them from AI. (The Guardian) + Spotify has faced criticism for its handling of AI. (BBC)
6 The US is backing a Congolese railway to break China’s grip on critical minerals. The old railroad is key to the race for critical metals in Africa. (Rest of World) + The US is also searching for alternative sources. (MIT Technology Review)
7 Huawei is set to overtake Nvidia in China’s AI chip market. It’s expected to capture the largest market share this year. (FT $)
8 Japan is building cardboard drones for the battlefield. The flatpack designs are cheap, disposable, and built at scale. (404 Media)
9 The more young people use AI, the more they hate it. Research shows that Gen Z doesn’t trust GenAI. (The Verge)
10 A new organoid can menstruate—and show how tissue repairs itself. It’s revealing how the uterus can shed without scarring. (Nature)
Quote of the day
“I suspect that there are a number of people who do not want to put the future of humanity in Mr Musk’s hands. But we’re not going to get into that.”
—Judge Gonzalez Rogers rebukes attempts by Elon Musk’s lawyer to focus on AI’s existential risks as part of his lawsuit against OpenAI, the New York Times reports.
One More Thing
This rare earth metal shows us the future of our planet’s resources
The materials we need to power our world are shifting from fossil fuels to energy sources that don’t produce greenhouse gas emissions.
Take neodymium, a rare earth metal used in powerful magnets that power everything from smartphones to wind turbines. Its story reveals many of the challenges we’ll likely face across the supply chain in the coming century and beyond.