Capability has outpaced norms
Systems can now execute meaningful multi-step work. Our standard must move from merely usable to auditable, reversible, and legible.
Principles
Software now drafts, decides, and acts. Interfaces are shifting from screens you drive to systems that can move on your behalf.
We are pro-acceleration and pro-human at the same time. This is our North Star for designing software when intelligence is no longer scarce.
Agentic systems are shipping. Skill requirements are shifting rapidly. Infrastructure costs are becoming product constraints. Principles must be strong enough for this reality.
The biggest risk is not only displacement. It is exclusion: people who could benefit but cannot trust, verify, or safely operate the tools.
Responsible AI is no longer a policy document. It is expressed in defaults, safeguards, and the user's ability to intervene.
If software optimizes only for speed, people lose judgment over time. We design for capability lift, not cognitive outsourcing.
Wasteful interaction patterns are no longer just inefficient. At scale, they become infrastructure debt.
Good interfaces widen participation. Bad interfaces concentrate leverage in the hands of already technical users.
These are moral and practical guidelines. We use them as decision criteria in product work, not as decorative language.
Moral stance: People should not lose authorship of outcomes that materially affect them.
Practical rule: Use graduated autonomy (draft, explain, confirm, execute). For high-stakes actions, require explicit approval and provide a clear rollback path.
Moral stance: Confusion transfers risk from the system to the user.
Practical rule: Every consequential action must answer three questions: what happened, why it happened, and what to do next.
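One way to make the three questions unavoidable is to encode them as required fields of the record every consequential action must emit. A minimal sketch; the `ActionReport` schema is hypothetical.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class ActionReport:
    """A consequential action must answer: what happened, why, and what next."""
    what_happened: str
    why: str
    next_steps: str

    def render(self) -> str:
        """Plain-language summary shown to the user after the action."""
        return (
            f"What happened: {self.what_happened}\n"
            f"Why: {self.why}\n"
            f"What to do next: {self.next_steps}"
        )
```

Because the fields are required, a report that skips one of the three questions fails to construct at all.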
Moral stance: Speed is not a virtue when the blast radius is large.
Practical rule: Introduce checkpoints for irreversible actions, legal commitments, health decisions, and money movement. Remove friction everywhere else.
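A simple way to apply this rule is an allowlist of action categories that always trigger a checkpoint, with everything else passing through. A sketch under assumed category names; `needs_checkpoint` is illustrative.

```python
# Categories that always get a confirmation checkpoint.
# Everything outside this set stays frictionless by design.
CHECKPOINTED: set[str] = {"irreversible", "legal", "health", "money"}


def needs_checkpoint(categories: set[str]) -> bool:
    """True if any category of this action demands explicit confirmation."""
    return not CHECKPOINTED.isdisjoint(categories)
```

Keeping the set small and explicit is the point: friction is reserved for a large blast radius, not spread everywhere.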
Moral stance: People should not have to trade dignity for utility.
Practical rule: Minimize collection, expire data by default, scope permissions tightly, and explain data flows in plain language.
Moral stance: A system that cannot be audited cannot be trusted.
Practical rule: Keep clear traces of decisions, tools used, and user approvals. Correction paths must be first-class product features.
Moral stance: Tools should make people more capable over time, not more dependent.
Practical rule: Surface uncertainty, expose assumptions, and make verification natural in the workflow.
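Surfacing uncertainty means the answer object itself carries its confidence and assumptions, so verification is part of the output rather than an afterthought. A minimal sketch; the `Answer` schema is hypothetical.

```python
from dataclasses import dataclass


@dataclass
class Answer:
    """An answer travels with its confidence and assumptions, never without."""
    text: str
    confidence: float        # 0.0-1.0, shown to the user, not hidden
    assumptions: list[str]   # what the user should verify

    def render(self) -> str:
        header = f"{self.text} (confidence: {self.confidence:.0%})"
        notes = "".join(f"\n- assumes: {a}" for a in self.assumptions)
        return header + notes
```

Showing the assumptions as a checklist gives the user a natural verification step instead of a bare assertion to trust.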
Build systems that expand human capability, preserve human agency, and earn trust by design.
If this resonates, follow what we do on the blog or reach out at contact@batjko.com.