An AI civilization simulation game. The player is not a god but an ordinary resident of the town: one participant among many, with no special powers and no overhead view. The name is Japanese: 僕らまち, roughly “our town.”
The game has no explicit goal. The interest is entirely in the simulation. You exist, you interact with people, you watch a world grow. Closest reference point in feel: Tomodachi Life. But the underlying system is nothing like it.
Tier 1 — Close Friends
Real small LMs. Hard cap: 8 citizens. Genuine unpredictability. Demoted if relationship fades.
Tier 2 — Townspeople
Weighted probabilistic grammar. No real LM. Cheap to run. Can be promoted to Tier 1.
Tier 3 — Frozen / Distant
Name and stat block on disk. Simulation paused until player gets close. Fast-forward catchup on approach.
NPC Brain
Emotion Model + Language Model + Senses Model + Memory System. All four interact.
Every playthrough starts with two people: Adam and Eve. They are templates, not fully random, but slightly randomized each run in personality, tendencies, and starting knowledge. That small initial randomness is intentional. It creates butterfly effect divergence — two worlds that start almost identically will diverge completely over generations because tiny differences compound. No two playthroughs produce the same civilization.
Tier 2 — Weighted Probabilistic Grammar: Not template sentences. Not a flat phrase list. A structured grammar system with category slots, e.g.: [subject_ref] + [feeling_verb] + [emotion_word] → “i've been feeling lonely.”
Each NPC has weight distributions over each word category: Feeling verbs (~50 entries), Emotion words (~100 entries), Action words (~80 entries), Subject references, time markers, qualifiers, etc. A sad NPC has high weights on negative emotion words. A farmer has high weights on harvest/labor action words. Individuality lives in the weights, not the words.
Word categories are separated by grammatical role so combinations stay coherent. A lightweight coherence layer (small hidden state) tracks topic context to prevent incoherent fragment chaining.
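To make the Tier 2 mechanism concrete, here is a minimal sketch of slot-based weighted sampling. All names (`GRAMMAR`, `speak`, the tiny word lists) are hypothetical illustrations, the real inventories would run to dozens or hundreds of entries per slot, and the coherence layer's hidden state is omitted for brevity.

```python
import random

# Hypothetical slot inventories; real categories would be far larger.
GRAMMAR = {
    "subject_ref": ["i", "we", "my family"],
    "feeling_verb": ["'ve been feeling", "feel", "am getting"],
    "emotion_word": ["lonely", "hopeful", "tired", "restless"],
}

SENTENCE_PATTERN = ["subject_ref", "feeling_verb", "emotion_word"]

def weighted_choice(words, weights, rng):
    # Missing entries default to a neutral weight of 1.0.
    return rng.choices(words, weights=[weights.get(w, 1.0) for w in words], k=1)[0]

def speak(npc_weights, pattern=SENTENCE_PATTERN, seed=None):
    """Sample one utterance from the grammar, biased by this NPC's weights."""
    rng = random.Random(seed)
    parts = [weighted_choice(GRAMMAR[slot], npc_weights.get(slot, {}), rng)
             for slot in pattern]
    # Join slots; glue leading-apostrophe fragments onto the previous word.
    return " ".join(parts).replace(" '", "'")

# A sad NPC: individuality lives in the weights, not the words.
sad_npc = {"emotion_word": {"lonely": 8.0, "tired": 6.0, "hopeful": 0.5, "restless": 2.0}}
print(speak(sad_npc, seed=42))
```

Because the grammar is shared and only the per-NPC weight tables differ, every citizen stays cheap: sampling is a handful of dictionary lookups per sentence.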
Inheritance: Children inherit a subset of parent weight distributions, biased toward fragments the parent uses most, then drift slightly from their own experiences.
Mutation: Small random drift occurs during inheritance — a word swaps, a phrase shifts. This preserves butterfly effect divergence across generations and prevents cultural convergence toward the player's dialect.
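The inheritance-plus-mutation step above could look something like the following sketch. The function name, the `usage_counts` log, and the specific `keep_fraction`/`drift` parameters are all assumptions for illustration, not fixed design values.

```python
import random

def inherit_weights(parent, usage_counts, keep_fraction=0.7, drift=0.1, seed=None):
    """Child inherits a subset of a parent's weight distribution, biased toward
    the fragments the parent actually uses most, then drifts slightly (mutation)."""
    rng = random.Random(seed)
    child = {}
    for slot, weights in parent.items():
        # Rank words by the parent's observed usage (hypothetical usage_counts log).
        ranked = sorted(weights,
                        key=lambda w: usage_counts.get(slot, {}).get(w, 0),
                        reverse=True)
        keep = ranked[: max(1, int(len(ranked) * keep_fraction))]
        child[slot] = {w: weights[w] for w in keep}
        # Mutation: small multiplicative drift so generations never fully converge.
        for w in child[slot]:
            child[slot][w] = max(0.01, child[slot][w] * (1 + rng.uniform(-drift, drift)))
    return child
```

The child then continues to drift from their own experiences at runtime; this function only sets the starting point.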
Tier 1 — Small LMs: Real generative language models. Complexity scales with active interaction hours (focused, direct interaction time, not just proximity).
The 5,000-token vocabulary ceiling is intentional; returns diminish beyond it. LMs grow incrementally each session, not in sudden jumps.
Bootstrap Handoff: When an NPC is promoted from Tier 2 to Tier 1, their LM is bootstrapped from their existing Tier 2 weight distribution. The handoff must feel seamless — the character should not noticeably “change” the moment they become a close friend.
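One plausible way to implement the handoff (an assumption, not the confirmed design) is to convert the Tier 2 weight table into additive logit biases applied to the fresh LM's output distribution, so the promoted character keeps their old voice while the model develops:

```python
import math

def weights_to_logit_bias(tier2_weights, scale=1.0):
    """Convert a Tier 2 word-weight table into additive logit biases for a
    freshly promoted Tier 1 model: each word's bias is its log-probability
    under the old grammar sampler, so early generations sound unchanged."""
    bias = {}
    for slot, weights in tier2_weights.items():
        total = sum(weights.values())
        for word, w in weights.items():
            bias[word] = scale * math.log(w / total)
    return bias
```

As the LM accumulates real interaction data, `scale` can be annealed toward zero so the learned voice gradually takes over from the inherited one.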
Inheritance for Tier 1 Children: A child of a Tier 1 NPC starts with a richer weight distribution and a head start on LM development. A child of a distant or culturally foreign NPC starts with weights that have little overlap with the player's language — they will feel hard to talk to. This is intentional and emergent, not scripted.
Every citizen has an AI brain. The language system is only one component; the full brain pairs it with the Emotion Model, the Senses Model, and the Memory System.
All four systems interact. An NPC sees something (senses), feels something (emotion), recalls relevant history (memory), and responds in their own voice (language).
Parents teach children. Children learn from parents and from everyone they spend time with. Knowledge, behavior, speech patterns, and values propagate through the population organically. No culture is scripted. Every civilization emerges from the specific chain of interactions that happened. A child raised by two cautious parents in an isolated part of town will be a different person than a child raised by adventurous parents in the social center.
Over generations, genuine cultural drift happens. The civilization the player encounters at generation 5 is a product of everything that came before it. The mutation system ensures this drift never fully converges, even across many generations.
Relationship depth maps to computational depth:
Tier 1 (active, close) — LM hot in memory; at most 1–2 running simultaneously.
Tier 2 (active, acquaintance) — grammar sampler running; cheap.
Tier 3 (frozen) — paused on disk; no CPU cost.
Fast-Forward Catchup: When approaching an NPC not seen in a while, a loading screen runs their simulation forward at high speed, catching them up on what they would have experienced since last contact. The loading screen is doing real work. For Tier 1 NPCs at high interaction hours, this catchup is heavier. Expected time on mid-range laptop hardware: 2–10 seconds for most cases.
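The catchup loop might be sketched as below. `FrozenNPC`, `simulate_coarse`, and the tick constants are hypothetical stand-ins; the point is that the loading screen advances real state in coarse chunks rather than interpolating.

```python
from dataclasses import dataclass

@dataclass
class FrozenNPC:
    # Hypothetical minimal state for a Tier 3 citizen on disk.
    last_sim_tick: int = 0
    hunger: float = 0.0

    def simulate_coarse(self, ticks):
        # Cheap bulk update standing in for needs/relationship/cultural drift.
        self.hunger = min(1.0, self.hunger + 0.001 * ticks)

def fast_forward(npc, elapsed_ticks, coarse_step=60):
    """Advance a thawed NPC in coarse chunks; the loading screen does this work."""
    t = 0
    while t < elapsed_ticks:
        step = min(coarse_step, elapsed_ticks - t)
        npc.simulate_coarse(step)
        t += step
    npc.last_sim_tick += elapsed_ticks

npc = FrozenNPC()
fast_forward(npc, 500)
```

For Tier 1 NPCs the coarse step would be smaller (or the update richer), which is why their catchup sits at the heavier end of the 2–10 second budget.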
Promotion / Demotion: NPCs move between tiers based on active interaction hours. Tier 1 cap is 8. Hitting the cap forces a demotion choice — who do you let drift? This is emergent social pressure, not artificial game design.
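The cap-forced demotion can be expressed in a few lines. This is a sketch under assumed names (`promote`, `demote_pick`); in the game the `demote_pick` decision is the player's, not an automatic policy.

```python
TIER1_CAP = 8  # hard cap regardless of town size

def promote(tier1, tier2, candidate, demote_pick):
    """Promote a Tier 2 NPC to Tier 1. If the cap is hit, demote_pick chooses
    who drifts back to Tier 2 -- the emergent social-pressure decision."""
    if len(tier1) >= TIER1_CAP:
        demoted = demote_pick(tier1)  # e.g. fewest recent interaction hours
        tier1.remove(demoted)
        tier2.add(demoted)
    tier2.discard(candidate)
    tier1.add(candidate)
```

Keeping promotion and demotion in one operation guarantees the invariant `len(tier1) <= 8` can never be violated between frames.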
Target platform: mid-range laptop, no dedicated GPU required at target scale. Reference hardware: quad-core CPU (e.g. Intel i5/i7 8th gen or later), 16GB RAM, integrated graphics. Recommended town size: 150 citizens.
Note: Tier 1 cap stays at 8 regardless of town size. 1000-citizen towns lose the intimacy that makes bokuraMachi interesting.
You live in the town. You talk to people, build relationships, exist. The world changes around you whether you engage or not. You influence the civilization by being part of it — the relationships you form, the things you teach people, the interactions you have all feed into the simulation. There is no win condition. There is no lose condition. The game ends when you stop finding it interesting, or never.
Copyright ©2026 hikikomori.no — All Rights Reserved