Tiny pill cameras look for cancer in the digestive system / Humans + Tech - #71
+ How Facebook got addicted to spreading misinformation + Popping the filter bubble + Other interesting articles
Hi,
I hope you had a good week. The first article this week shows once again that, with time, science fiction becomes science.
Thousands of patients are swallowing tiny pill cameras to look for cancer
In the UK, 11,000 NHS patients with symptoms of bowel cancer are receiving a pill-sized capsule with a tiny camera inside it. As the cameras pass through the body’s digestive system, they take pictures twice per second to check for signs of cancer and other conditions like Crohn’s disease [Victor Tangermann, Futurism].
The capsules, called PillCams, are easy to swallow and far less invasive than other cancer screening methods, according to the researchers behind them. The images the cameras capture are stored in a data recorder that the patient carries around with them.
“Every year in England, we diagnose around 42,000 people with bowel cancer, that’s more than 100 people a day,” NHS clinical director for cancer Peter Johnson told Sky News. “We think that this camera test might be a better option than waiting for a normal colonoscopy.”
One patient described the process as smooth, comfortable and literally pain-free. It’s fascinating that in just under two decades, we’ve gone from cameras on mobile phones being a novelty to cameras so tiny they can fit inside a pill and take photos of our insides.
How Facebook got addicted to spreading misinformation
After nine months of intense interviewing, researching, and reporting, Karen Hao of MIT Technology Review released this in-depth article about how Facebook’s addiction to growth, pursued by using AI to maximise engagement, has made the platform a primary engine for spreading misinformation and hate speech [Karen Hao, MIT Technology Review].
By the time thousands of rioters stormed the US Capitol in January, organized in part on Facebook and fueled by the lies about a stolen election that had fanned out across the platform, it was clear from my conversations that the Responsible AI team had failed to make headway against misinformation and hate speech because it had never made those problems its main focus. More important, I realized, if it tried to, it would be set up for failure.
The reason is simple. Everything the company does and chooses not to do flows from a single motivation: Zuckerberg’s relentless desire for growth. Quiñonero’s AI expertise supercharged that growth. His team got pigeonholed into targeting AI bias, as I learned in my reporting, because preventing such bias helps the company avoid proposed regulation that might, if passed, hamper that growth. Facebook leadership has also repeatedly weakened or halted many initiatives meant to clean up misinformation on the platform because doing so would undermine that growth.
In other words, the Responsible AI team’s work—whatever its merits on the specific problem of tackling AI bias—is essentially irrelevant to fixing the bigger problems of misinformation, extremism, and political polarization. And it’s all of us who pay the price.
This is a must-read article, touching upon events from the Cambridge Analytica scandal to the genocide in Myanmar to Facebook’s late action against Holocaust deniers, anti-vaxxers, and QAnon. Hao’s primary source was a series of video calls with Joaquin Quiñonero Candela, a director of AI at Facebook.
Popping the filter bubble
Julia Angwin, editor in chief of The Markup, interviewed Eli Pariser, author of the 2011 book “The Filter Bubble,” a term he coined to describe the personalized ecosystem of information created by the algorithms of Big Tech [Julia Angwin, The Markup].
The Markup, inspired by Eli Pariser’s work, recently released Split Screen, a tool that shows how Facebook’s news feeds are personalized for different groups, such as Biden voters vs Trump voters, women vs men, and Millennials vs Boomers.
The interview is fascinating, especially for Pariser’s views 10 years after writing the book and his idea of a digital public park to get strangers to behave well together online. Here’s one question from the interview:
Angwin: You created the term “filter bubble” 10 years ago. Can you talk about what your insight was at the time?
Pariser: My interest really started with trying to understand how communication was changing, and how the way the information flowed was changing with the rise of platforms like Facebook and Google. I had this moment where I realized, “Oh, they’re all going to be powered by personal data and trying to reflect back what they think we’re most likely to click on or engage with.”
When all of a sudden those are a lot of the primary places that everybody’s getting information, you start to imagine this personal universe of information that’s generated by all these different algorithms, just for you or for who they think you are. And that was the filter bubble.
And so what does that mean for democracy? I was worried that it would be harder and harder to kind of live in a shared information universe or even really have a shared reference point of how far out you were.
It’s like everyone’s always lived in their own information universe in a way, but you can kind of see like, O.K., am I one standard deviation from the mean or five or 10, you know? And I guess part of my concern with personalization is like, You don’t even really have that reference point.
You don’t know how weird your bubble is in reference to anyone else’s because you can’t actually see it.
Other interesting articles from around the web
🍟 Drive-throughs that predict your order? Restaurants are thinking fast [Julie Creswell, The New York Times]
As Covid has increased drive-through traffic, restaurants are looking at new ways they can increase efficiency.
But this year, Burger King is testing a Bluetooth technology that will be able to identify customers in Burger King’s loyalty program and show their previous orders. If a customer ordered a small Sprite and a Whopper with cheese, hold the pickles, the last three visits, Deep Flame will calculate that chances are high that the customer will want the same order again.
🖼 An NFT just sold for $69 million at Christie’s and was paid for in cryptocurrency [Connie Lin, Fast Company]
Beeple, a digital artist, has created one piece of digital artwork every day since 2007. He combined the first 5,000 pieces into one JPEG called “Everydays: The First 5000 Days.” The artwork was the first-ever digital piece to be auctioned at Christie’s and fetched over $69 million. It was sold as an NFT (nonfungible token), which certifies authenticity and ownership of digital assets through a blockchain.
+ Jack Dorsey is auctioning his first tweet as an NFT [Jay Peters, The Verge]. The current bid is at $2.5 million. He will donate the proceeds to charity. Interested? You have until March 21 to up the bid.
Curious to learn more about NFTs and how they work? If so, let me know, and I’ll be happy to do a deep-dive into NFTs in an upcoming newsletter.
💰 For Creators, Everything Is for Sale [Taylor Lorenz, The New York Times]
As the market for digital stars and influencers gets more competitive, creators are handing control of different aspects of their lives and day-to-day decisions over to their fans as a way to monetize more parts of their lives.
For example, a creator can use NewNew to post a poll asking which sweater they should wear today, or who they should hang out with and where they should go. Fans purchase voting power on NewNew’s platform to participate in the polls, and with enough voting power, they get to watch their favorite influencer live out their wishes, like a real life choose-your-own-adventure game.
“Creators are burning out, but their fans want more and more,” said Jen Lee, 25, the founder of a popular creator economy community on Discord. “By monetizing each aspect of their life, they can extract value from everyday interactions.”
Quote of the week
“When you’re in the business of maximizing engagement, you’re not interested in truth. You’re not interested in harm, divisiveness, conspiracy. In fact, those are your friends.”
—Hany Farid, a professor at the University of California, Berkeley, from the article “How Facebook got addicted to spreading misinformation” [MIT Technology Review]
I wish you a brilliant day ahead :)
Neeraj