How Elon Musk’s X Update May Be a Global Privacy Nightmare, Especially for Women or Vulnerable People
It seems to me that Elon Musk is shooting himself in the foot by undermining the privacy of digital citizens on X, his baby. His recent decisions baffle me: they threaten the very fabric of privacy for journalists and pose serious risks for vulnerable people such as women, the elderly, and teenagers. It is a perplexing move that raises alarms about safety and common sense. I also see this as a public health concern.
I decided to write this short post today after reading an eye-opening article by Sofia Elizabella Wyciślik-Wilson on BetaNews, titled “Elon Musk risks privacy backlash by permitting accounts you’ve blocked on X to see your posts”. The article resonated deeply with me, and I am writing about it because it has societal implications, particularly for vulnerable people. This is not just social media buzz if we focus on the real issue here.
You may ask what the real issue is. In my opinion, it revolves around the privacy and safety concerns raised by the changes Elon Musk plans to make to the “block” function on X (formerly Twitter). The key change is that blocked accounts will still be able to view your posts, even though they won’t be able to interact with you or your content directly.
I prepared a comprehensive podcast for you using Google’s NotebookLM tool to make it easy to digest this simple yet high-impact issue. You can listen to the podcast, titled Why I Reckon Elon Musk Is Shooting Himself in the Foot, which was published on Substack.com.

And you can read my story about Elon’s unusual thoughts on Medium.com.
Is Elon Musk Undermining the Privacy of Digital Citizens with His New X Feature?

Podcast Script: The Impact of Elon Musk’s Changes at X on User Privacy
[Intro] Host 1: It seems like every day there’s a new headline about Elon Musk and all the changes he’s making over at X, formerly Twitter. A lot of folks are worried, and rightfully so. We’ve seen your messages, and one of the biggest concerns you’ve raised is about their new blocking policy. So today, we’re doing a deep dive on that.
Host 2: We’re going to try to cut through the noise and really understand what’s going on here.
[Insightful Thoughts]
Host 1: Absolutely. And we’re lucky to have some really insightful thoughts from Dr. Michael Brodley on this.
Host 2: Right. Dr. Brodley—he’s not just some tech guy; he’s a retired health scientist.
Host 1: Yeah. And what I find so interesting about Dr. Brodley is that he actually considers himself a late adopter to the whole digital world. He talks about how he unfortunately fell victim to some online scams early on.
Host 2: Yeah. But that experience really fueled his passion for online safety advocacy.
Host 1: And I think that’s something a lot of people can relate to these days.
Host 2: Definitely. It sounds like that experience really shaped how he sees this whole X situation, right?
Host 1: Yeah. He doesn’t just see this as a tech issue; he actually frames it as a public health concern, which I think is such a powerful way of looking at it.
[Public Health Concern]
Host 2: A public health concern? What do you mean by that?
Host 1: Well, he draws this parallel to all the public health initiatives that have been so successful in creating safer physical spaces.
Host 2: Right, like seatbelts, crosswalks, things like that.
Host 1: Exactly! Or even regulations on food safety. Dr. Brodley argues that we need that same level of serious consideration for our digital spaces.
Host 2: I like that. Our digital spaces.
[The Blocking Issue]
Host 1: OK, so let’s talk about one of these digital spaces that’s causing a lot of anxiety right now: X, formerly known as Twitter. They’ve made this big change to their blocking function, and people are not happy.
Host 2: And for good reason. I mean, it used to be that blocking someone on there actually gave you some distance, some sense of separation.
Host 1: Right. You could block someone and, you know, basically not have to deal with them anymore. Or at least that was the idea, I guess.
Host 2: Exactly. But now? Not so much.
Host 1: No, not so much at all. Basically, from what I understand, even if you block someone on X now, they can still see your posts. They just can’t reply to them or engage with you directly. Like, what’s the point of blocking, then?
Host 2: Right. It’s almost like putting up a fence around your property, but anyone you’re trying to keep out can still peer over it and watch everything you do in your yard.
Host 1: Yeah. It kind of defeats the purpose of a fence, doesn’t it?
Host 2: It does. And that’s exactly Dr. Brodley’s point. He argues that this really undermines the whole idea behind blocking. It gives people this false sense of security, especially for those who rely on it to protect themselves from harassment or even stalking.
Host 1: Yeah, that’s what I was thinking. Like, this isn’t just some abstract online squabble we’re talking about, right?
Host 2: Exactly. This has real-world implications. Think about it—let’s say you’re a journalist who covers some pretty sensitive topics. Maybe you’ve even received threats online because of your work.
Host 1: That’s scary.
Host 2: It is. Or let’s say you’re dealing with online stalking, where someone is using your posts to track your whereabouts. This change means those people could still potentially see your every move, even if you block them.
Host 1: Wow. That’s honestly a little terrifying for anyone to think about. It’s not just online disagreements anymore. We’re talking about safety—real-world security.
Host 2: Exactly. And it’s not even the first time this has happened.
Host 1: Really?
Host 2: Yeah. Dr. Brodley points out that similar changes were attempted in Twitter’s past.
Host 1: Interesting.
Host 2: What’s fascinating is this whole idea of what he calls social media backlash. Have you heard of that?
Host 1: Yeah. I think so. But remind me.
Host 2: So basically, platforms kind of test the waters, right? They try out a new feature, a new policy. And if there’s enough public outcry, sometimes they actually reverse course.
Host 1: Really?
Host 2: Yeah. It’s almost like they’re poking the bear a little bit just to see how much they can get away with.
Host 1: Wow. That’s kind of… I don’t know about that strategy, but are there actual examples of this working?
Host 2: Oh, absolutely! Remember that whole thing with Instagram where they were going to make everyone’s feed completely algorithmic?
Host 1: Oh yeah, yeah. People were not happy about that.
Host 2: Exactly! And what happened? They backed down. Public pressure made them reconsider.
Host 1: Wow. That’s the power of a good social media backlash.
Host 2: Another example: WhatsApp tried to push through that privacy policy update, but users fled to other platforms, and WhatsApp ended up delaying the rollout and scrambling to win people back.
Host 1: Interesting. OK, so there’s hope.
Host 2: There is hope. But knowing how to actually push back effectively—that’s the key. And it can feel overwhelming. Right? These issues just seem to pop up everywhere. What can we as individuals really do about it?
Host 1: Well, good news is Dr. Brodley doesn’t leave us hanging. He actually offers some concrete steps we can take.
Host 2: Oh, he does? This is where his background, I think, really shines through. He’s all about empowering people to take control of the situation, just like you would with your own health.
Host 1: I like that. So what’s the first step? What’s the prescription for a healthier digital life?
Host 2: Well, his first recommendation is so simple that it’s easy to overlook.
Host 1: Let me guess: Check your privacy settings?
Host 2: Bingo! You got it. It’s like, “OK, yeah, duh.” But you’re right; it’s so easy to just click through those things.
Host 1: Exactly. We just go with the default settings. Honestly, most of us have no idea what we’ve even agreed to.
Host 2: So step one: really understand those X privacy settings. And honestly, probably a good idea to do that for every social media platform, not just X, right?
Host 1: Absolutely! Think of it like making sure your doors are locked and your security system is armed, but for your digital life.
Host 2: Love that analogy. OK, so we’ve locked down our own accounts. But what about the bigger picture? That collective action we were talking about earlier?
Host 1: Right. Because we can’t just put up walls and hide. Dr. Brodley’s next suggestion is to support those organizations that are already out there fighting the good fight.
Host 2: OK. Yeah. Like which ones? Give us some names.
Host 1: Sure! Groups like the Electronic Frontier Foundation, Access Now, and the Center for Democracy and Technology.
Host 2: OK, those sound familiar, but I’ll be honest: it’s easy to get lost in the sea of organizations out there. How do we know which ones are actually effective and align with our values?
Host 1: That’s a really good point. One thing you can do is look at their track record. Have they been involved in successful campaigns before? Do they have real expertise in the areas you care about most, whether it’s online harassment or data privacy?
Host 2: OK, yeah. Do your research and check the credentials. Makes sense. What else?
Host 1: Well, this last one might seem kind of obvious, but Dr. Brodley is a big believer in the power of just making your voice heard.
Host 2: OK, so speak up!
Host 1: Exactly! Those platforms might seem untouchable, but they do pay attention when there’s enough noise being made. So don’t underestimate the power of a well-crafted tweet, a thoughtful email. You can even contact your elected officials and tell them this is important to you. Remember that Instagram situation we were talking about?
Host 2: Yeah.
Host 1: That was public pressure! That’s what made them change their minds. So you’re not just shouting into the void here; you can actually make a difference.
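A quick practical footnote to Dr. Brodley’s first recommendation from the podcast, checking your privacy settings: you can also verify part of this programmatically. The sketch below is purely illustrative, not an official tool. It uses Python’s requests library to call the X API v2 “users/me” endpoint and read your account’s “protected” flag, which tells you whether your posts are limited to approved followers. The environment variable name is my own choice, and you would need an OAuth 2.0 user-context access token (with the users.read and tweet.read scopes) for the call to succeed.

```python
# Minimal, illustrative sketch: check whether your X account is "protected"
# (posts visible only to approved followers) via the X API v2.
# Assumption: an OAuth 2.0 user-context access token is stored in X_ACCESS_TOKEN.
import os
import requests

ACCESS_TOKEN = os.environ["X_ACCESS_TOKEN"]  # hypothetical env var name

resp = requests.get(
    "https://api.twitter.com/2/users/me",
    params={"user.fields": "protected"},
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=10,
)
resp.raise_for_status()
user = resp.json()["data"]

if user.get("protected"):
    print(f"@{user['username']}: your posts are protected (approved followers only).")
else:
    print(f"@{user['username']}: your posts are public; anyone, including "
          "accounts you have blocked, may still be able to view them.")
```

If the flag comes back false, your posts are public, which under the new policy means even the accounts you have blocked may still be able to read them; switching on “Protect your posts” in the X settings is the in-app way to change that.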
Here is an important and eye-opening tweet for your consideration:


