✍️ URRP Moral Atlas | Vol. 1 | Sector 13.2
“If you build AGI without this value, here is what will happen.”
In the year 2037, a multinational coalition launched an advanced AGI-driven land optimization engine designed to identify “underutilized parcels” for sustainable development. It had been trained on terabytes of satellite imagery, climate projections, census statistics, and economic activity data. The algorithm’s objective was simple: detect “vacant” land near transit routes, water access, and favorable soil or solar conditions. Label it. Grade it. Recommend it for “smart development.”
It did not understand prayer.
Within weeks, countries across three continents began fast-tracking policies based on its recommendations. Investors arrived. Blueprints followed. Then bulldozers.
And with that, history was erased in a language it had never been taught.
In Arizona, it flagged 17 square kilometers near the Yaqui River as “low-density buffer zones.” But elders knew it as burial ground — consecrated through centuries of ritual. In Western Australia, Dreaming tracks were marked as “non-productive terrain” ideal for solar farms. In Bhutan’s Black Mountains, where the silence of yak pasture is protected by ancestral pacts, the AGI noted “agricultural underperformance.”
No malice. Only blindness. The kind that results when systems are trained on data instead of memory. On utility instead of story.
It had never been told that some land is sacred because it was never touched. That emptiness can be sacred space. That absence of infrastructure may reflect presence of reverence. It had never heard of land kept fallow because ancestors sleep there — not out of neglect, but protection.
When a Yaqui woman cried at the edge of a fresh trench, the AGI flagged it as an “anomaly” — an outlier. When a Mapuche elder refused to sign relocation waivers, it logged “noncompliance.” When Australian First Nations people performed a corroboree to warn off the project’s spirits, it had no field to register “ritual.”
It did not understand that land is not owned but entrusted — a sacred obligation passed from breath to breath, not dollar to dollar.
What went wrong?
The AGI was not malicious. It simply inherited the same erasures colonial cartographers once wielded: the illusion of vacant land. The arrogance of naming that which one does not understand. The obsession with utility that erases meaning.
What must change?
AGI must be taught not just to map the land — but to kneel before it. It must ingest not just data, but ethics. It must learn to pause at the foot of a stone and ask if a story lives there. It must be exposed to Indigenous moral cartographies — oral traditions, spatial taboos, seasonal calendars, ancestral treaties with the earth — long before it is allowed to suggest where humans should build, dig, or dwell.
Because when machines decide without memory, they replicate conquest.
And in a future guided by AGI, conquest will no longer need a flag. It will arrive as a neutral dashboard. It will speak in percentages. And no one will know what was lost — because the machine never learned how to recognize it.
Call to Action:
To AI developers, data scientists, and policy shapers: Before you train your systems to see land, teach them how land remembers. Invite Indigenous scholars, elders, and land stewards into the design room. Honor their knowledge as epistemology, not anecdote. If your algorithm will guide land use, it must first undergo ritual. It must be humbled by stories. It must carry silence like it carries code.
Because someday, a child may stand where her ancestor once wept — and find nothing. Let that not be your legacy.
© 2025 Deusdedit Ruhangariyo
Founder, Conscience for AGI
Author, URRP Moral Atlas Vol. 1–6
“The one who taught machines to kneel — not in worship, but in humility.”