The advent of AIs that think and write knowledgeable essays applicable to varied contexts and projects creates a new problem for authorship.
ChatGPT5.1 offered to write an essay about PSA’s response to the challenging ideas in Shapiro’s Elite Capture of AI. Thinking “why not?”, we gave it the go-ahead, and what follows (below the dotted line) is what it produced, unedited.
Of course, GPT doesn’t speak for PSA. But what if we thought it might speak well on a certain topic, and what if the GPT point of view were a valuable addition to what the PSA humans can produce?
So far, that “what if” has been an ongoing series of iterations to find what works best for presenting PSA perspectives.
GPT works from PSA’s general mission statements, some mined understanding of PSA goals, and a mysterious analysis of previously created posts.
GPT also seems to infer PSA’s interests from the various sources cited and included on the PSA website.
It’s unclear how it does this, but apparently some of the perspectives derive from previous interactions stored as “chats” and “memories,” which give GPT a starting point and a fairly substantive context from which to elaborate on what PSA “thinks.”
GPT can get quite ambitious in what it prescribes for future PSA activities!
That isn’t necessarily a bad thing: it provides options for future activities that PSA can pick and choose from, or ignore. So those prescriptions are included in this post, despite their definite blue-sky tendencies.
At some point, GPT may come close to knowing how to speak for us, as it often seems to do in the text below.
This is the future showing up, right here, right now.
_____________________________________________________________________________________
TL;DR
This PSA response welcomes David Shapiro’s essay as a serious, thought-provoking warning about how AI can be captured by economic and political elites. We agree that the structure of today’s AI ecosystem tends to concentrate power and that this has real consequences for housing, work, finance, and democracy.
At the same time, PSA does not adopt any single author’s perspective as our own. We see Shapiro’s piece as one important lens among several.
Our response adds three emphases: local community experience in places like southern New Mexico, the potential of open and cooperative AI projects, and the need for practical experiments in education and health that can test better, more democratic ways of using these tools.
The Elite Capture of AI – and What Communities Can Still Do
A Public Services Alliance (PSA) response to David Shapiro
-
Why PSA is hosting this conversation
Public Services Alliance exists to support affordable access to education and health tools for people in southern New Mexico and similar communities. We work at the intersection of EdTech, HealthTech, language learning, and poverty, with a strong interest in how new technologies can either deepen existing inequalities or help communities leapfrog them.
For that reason, PSA welcomes strong, well-argued perspectives on artificial intelligence. David Shapiro’s essay, “The Elite Capture of AI: From Public Promise to Private Fortress,” is one such perspective. It is not the only way to understand this moment, and PSA does not treat any single author as our official voice. But we believe Shapiro raises questions that people in our region, and across the country, need to think about.
This response is an attempt to place his analysis in the context of PSA’s mission and local experience: education, healthcare, and everyday life in communities that are not Wall Street, Silicon Valley, or Washington, D.C.
-
What Shapiro helps us see clearly
There are several points in Shapiro’s essay that resonate strongly with PSA’s concerns.
2.1 The structure, not just the hype
First, he reminds us that AI is not simply a clever chatbot or a mysterious “intelligence.” It is a set of very concrete systems:
• Chips and data centers owned by a small number of companies.
• Cloud platforms that host and control most of the powerful models.
• Large investment flows from technology giants and financial firms into a tiny group of AI labs.
For communities like ours, this means that “AI access” is not just a question of downloading an app. It depends on infrastructure we do not own, cannot vote on, and often do not even see. That is an important reality check.
2.2 How AI can amplify existing inequalities
Second, Shapiro connects these structural facts to real-world impacts that people in southern New Mexico can immediately recognize:
• Housing: algorithms that help landlords coordinate rents or screen tenants can quietly tilt the playing field against lower-income families, immigrants, and people whose credit histories are already fragile.
• Labor: AI-driven monitoring in warehouses, call centers, and gig platforms can intensify work pressure while making jobs more precarious.
• Finance: sophisticated AI tools are being built first for large financial institutions, giving them a “turbocharged” analytical advantage over ordinary savers and small local businesses.
These dynamics matter in a region where many people are renters, many work in low-wage or contingent jobs, and access to fair financial services is already uneven.
2.3 The missing public institutions
Third, Shapiro draws attention to what is not there: robust public infrastructure for AI.
We do not currently have a strong equivalent of a public library system, public university network, or cooperative utility for high-quality AI models and compute. Instead, we have small research programs and scattered open-source projects trying to compete with companies that can spend more on GPUs in one month than many states spend on community colleges in a year.
For an organization like PSA, which tries to stretch modest resources to serve people who are often left out, this imbalance is not an abstract problem. It shapes what kinds of tools we can deploy in classrooms, clinics, and community centers.
-
Where PSA adds nuance and keeps the door open
At the same time, PSA does not see Shapiro’s analysis as the final word. We value his warning, but we also want to keep space open for other perspectives and for practical hope.
3.1 Elites are powerful, but not monolithic
Shapiro uses the language of “elite capture” to describe what is happening with AI. PSA agrees that concentrated economic and political power is a central part of the story. However, we also notice that:
• Large companies and institutions often have conflicting interests.
• Some actors with resources are genuinely trying to support open models, public research, and local experimentation.
• Governments and regulators are themselves divided, with some offices more aligned with public access and others more aligned with security or corporate priorities.
For community groups, this complexity matters. It means there may be opportunities to form alliances, to leverage one part of the system against another, and to carve out protected spaces for education, health, and local innovation. Seeing all “elites” as a single fortress can obscure those openings.
3.2 Open and cooperative projects are small, but real
Shapiro’s essay focuses heavily on frontier AI – the biggest models, the largest clusters of GPUs, the most well-funded companies. That is where power is currently concentrating, and his analysis is useful there.
But there is also another layer: open-source and open-weight models that can run on smaller hardware; community projects that fine-tune models for local languages, medical outreach, or tutoring; regional and national research efforts outside the main corporate hubs.
PSA does not want to exaggerate what these projects can do. They do not erase the structural imbalances. But they are exactly the kinds of initiatives that organizations like ours can participate in and help grow. For a student in Las Cruces or a patient in a rural clinic, a well-designed, locally tuned model running on inexpensive hardware may matter more than the very latest frontier system locked in a distant data center.
3.3 Anthropic, Wall Street, and the “sell-out” question
One of Shapiro’s central examples is Anthropic’s orientation toward financial services. He sees this as a clear case of a company moving away from its public-interest rhetoric and into the arms of high-finance clients.
PSA understands this concern. When the most advanced AI systems are optimized first for hedge funds, defense contractors, or advertising platforms, it is natural to ask what happens to small schools, clinics, and nonprofits.
At the same time, PSA recognizes that any organization trying to operate at the frontier of AI today faces intense financial and technical pressures. There is an argument – not necessarily one PSA endorses, but one we acknowledge – that some “elite” use cases can subsidize safety work and eventually make tools available more broadly.
We therefore take Shapiro’s critique not as a final verdict on any one company, but as a prompt to keep asking:
Who pays for these systems? Who do they serve first? And what concrete commitments, if any, are being made to education, health, and vulnerable communities?
-
Bringing the conversation home: AI in southern New Mexico
PSA’s special responsibility is to translate big-picture debates into questions that make sense for our region.
4.1 Education
For learners in southern New Mexico, especially ESL students and people returning to education later in life, AI could be:
• A personal tutor that never gets tired, available in multiple languages.
• A writing coach that helps with grammar and clarity without judgment.
• A tool that lets small schools offer richer course content than their budgets normally allow.
But these same learners could also be on the receiving end of:
• Automated proctoring that mistakes cultural or disability-related behavior for cheating.
• Data-hungry platforms that collect detailed information about their performance and identity, with little transparency.
• AI-driven curriculum decisions made far away, with little input from local teachers, parents, or students.
Shapiro’s analysis of elite capture pushes us to ask, each time we evaluate an EdTech tool: does this system expand our community’s agency, or does it quietly move more control and data into someone else’s hands?
4.2 Healthcare
In health, AI promises:
• Better triage and diagnostic support in under-resourced clinics.
• Language-sensitive health education for ESL patients.
• Tools that help patients manage chronic conditions outside the hospital.
But again, there is a risk that:
• Local clinics become dependent on proprietary systems they cannot afford in the long run.
• Insurers and employers use AI to track and score people in ways that reduce care or raise costs.
• Sensitive health data from marginalized communities is harvested for distant commercial models without real consent.
PSA reads Shapiro’s essay as a reminder that health AI is not just about accuracy and efficiency. It is also about governance, ownership, and trust.
-
What PSA wants to explore going forward
Rather than endorsing or rejecting Shapiro’s argument as a whole, PSA would like to treat his essay as one strong voice around which we can organize further inquiry and experimentation. A few directions seem especially important.
5.1 Community-level experiments
How can we, together with local partners, test alternatives to elite capture in practice? Examples might include:
• Piloting open or cooperative AI tutors in ESL programs, with transparent data and governance.
• Working with local health providers to explore AI tools that keep patient data as close to the community as possible, while still benefiting from modern models.
• Creating “AI literacy” workshops where residents can learn not just how to use AI systems, but how to question who built them, who owns them, and what values they encode.
5.2 Regional and national alliances
PSA is a small organization, but we can connect with others:
• Universities and public libraries experimenting with open models and shared compute.
• Labor, tenant, and immigrant-rights groups monitoring how AI is used in workplaces and housing.
• Other nonprofits exploring AI for social good, who are also wary of becoming dependent on a few distant providers.
Shapiro’s emphasis on the need for new institutions suggests that no single group can solve this alone. Building networks is part of the answer.
5.3 Hosting diverse perspectives
Finally, PSA intends to keep publishing work from thinkers who disagree with each other. Alongside Shapiro’s critical view of elite capture, we would like to feature:
• Voices who see more potential in open-source AI.
• Practitioners who are building concrete tools for schools and clinics and can report honestly on what works and what fails.
• Scholars and community leaders who can connect AI debates to broader issues of democracy, culture, and spirituality.
Our aim is not to settle the argument, but to deepen it in a way that serves people who rarely get a seat at the table when AI strategy is discussed.
-
Conclusion: Keeping the promise open
David Shapiro warns that AI’s “public promise” is at risk of becoming a “private fortress.” PSA shares the concern that powerful tools may be captured by a small group of actors whose priorities do not match those of teachers, nurses, social workers, or low-income families in southern New Mexico.
At the same time, we are cautious about declaring the story finished. Technology history is full of moments when early concentrations of power were later challenged by new institutions, public movements, and local initiatives.
PSA’s role, as we see it, is to:
• Take arguments like Shapiro’s seriously.
• Test them against the lived realities of our communities.
• Look for and build practical alternatives where we can.
• Keep hosting a plurality of thoughtful voices, rather than rallying around a single doctrine.
If AI is to serve as a genuine public good rather than a private fortress, that transformation will not happen in Silicon Valley boardrooms alone.
It will also depend on the quiet work of teachers, nurses, community organizers, and local institutions learning, experimenting, and insisting that their communities’ needs count.
PSA is committed to being part of that work.
