Discussion about this post

Ms. Maine

I'm worried about what's happening with algorithms, and I'm not being dramatic; I'm looking at the actual policies. And I'm worried for my granddaughter, who is learning tech, and for her history, our history.

Look, I know tech talk can be boring, but these executive orders aren't what they seem. They're reshaping what information we can access and how we understand our shared history. It's like someone's not just editing our photo albums but programming what pictures we can take tomorrow.

EO 14318? It's basically saying "forget the environment, we need more server farms!" (Think Starbucks, but for data—one on every corner.)

EO 14319 makes me laugh in that nervous way—suddenly, AI that acknowledges diverse perspectives is "biased," but AI that ignores them is "neutral." That's like calling vanilla "flavorless" and everything else "too spicy."

And EO 14320? It's about spreading this approach globally. It's like we're exporting a very specific American apple pie recipe and insisting it's the only way to bake.

I care about this because it affects all of us. When the systems running our schools, hospitals, and voting booths are trained on selective information, we all lose something precious: our ability to make truly informed choices.

We can do better! Start by asking questions about the tech you use. Where did it learn what it "knows"? Who decided what it should forget?

This matters because algorithms are becoming our shared memory. And just like I want my friends to call me out when I misremember something, I want our digital tools to reflect our full, complicated, beautiful reality.

Let's build technology that brings us together rather than divides us. The future is still ours to shape, and I believe we can create one where machines help us see more clearly, not less.

@Nadine Jones: Thank you, honey. The world needs your voice in this conversation.
