Publications

A short companion note describing how to read and interpret this body of work is available here.

David B. Forbes

David B. Forbes is an independent systems researcher focused on resilient distributed systems, governance-first autonomy, and safety-constrained behavior in autonomous and semi-autonomous systems.

His work examines how authority, legitimacy, and execution boundaries must be explicitly defined and enforced to prevent unsafe or unjustified system behavior under uncertainty. He is particularly concerned with failure modes arising not from insufficient intelligence, but from ambiguous, degraded, or improperly constrained authority.

Forbes’ research emphasizes architectural invariants such as authority contraction, refusal as correct behavior, quorum-gated execution, and degraded operational states. These principles apply across software systems, cyber-physical platforms, and human-supervised autonomous environments.
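
To make these terms concrete, the following is a minimal, hypothetical sketch of quorum-gated execution with refusal as the default outcome. It is not drawn from any of the publications listed below, and every name in it is illustrative.

    # Hypothetical sketch only: an action becomes eligible for execution
    # when a quorum of authorizers approves and the system is not degraded;
    # otherwise the system refuses, and refusal is recorded as a correct
    # outcome rather than an error.
    from dataclasses import dataclass
    from typing import Callable

    @dataclass(frozen=True)
    class Decision:
        executed: bool
        reason: str

    def quorum_gated_execute(action: Callable[[], None], approvals: int,
                             total_authorizers: int, quorum: int,
                             degraded: bool) -> Decision:
        if degraded:
            # Authority contraction: in a degraded state the system narrows
            # what it may do, so it declines to act.
            return Decision(False, "degraded state: authority contracted")
        if quorum > total_authorizers or approvals < quorum:
            # Refusal as correct behavior: lack of quorum is a legitimate
            # non-action, not a fault to be retried blindly.
            return Decision(False, "quorum not established")
        action()
        return Decision(True, "quorum satisfied")

    if __name__ == "__main__":
        print(quorum_gated_execute(lambda: print("acting"),
                                   approvals=1, total_authorizers=3,
                                   quorum=2, degraded=False))

The only point of the sketch is that non-execution is a first-class, reasoned result rather than an error path, which mirrors the framing of the works below.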

He is affiliated with BLOCK VECTOR Technologies, LLC, where his research informs long-lived system architecture, safety modeling, and patent-backed engineering work. These publications are not product documentation and do not disclose implementation details.


Published Works

Authority, Refusal, and Resilience in Autonomous Systems

Doctor of Engineering (D.Eng.) Dissertation

A practice-based doctoral dissertation examining authority boundaries, refusal, and failure modes in autonomous and distributed systems. The work is dense and intentionally formal, framing authority contraction and designed refusal as safety invariants rather than optimization outcomes.

Intended for systems engineers, architects, and reviewers seeking a rigorous, boundary-focused treatment of autonomy, governance, and failure behavior.

DOI: https://doi.org/10.5281/zenodo.18431599

ENGINEERING SEAMS — Observed, Not Resolved: What Authority Leaves Behind

David B. Forbes · Preprint · 2026

An observational lens for identifying engineering seams—where system invariants thin, authority becomes implicit, and decisions go unstated.

DOI: https://doi.org/10.5281/zenodo.18473538

The Missing Layer: The Hidden Risk in Modern Autonomous Systems

Identifies the architectural constraint whose absence causes modern autonomous and AI-driven systems to fail at boundaries, even when their implementations appear correct.

DOI: https://doi.org/10.5281/zenodo.18268752

Designing Systems That Behave Correctly in Silence

Defines correct behavior when coordination is absent—treating silence as a valid operating condition and non-action as a legitimate outcome.

DOI: https://doi.org/10.5281/zenodo.18269307

Resilience Is Not Uptime

Reframes resilience as purpose preservation under uncertainty rather than continuous operation, challenging availability-driven thinking and metric selection.

DOI: https://doi.org/10.5281/zenodo.18290130

Authority Contraction and Refusal as Safety Invariants in Autonomous Systems

Formalizes bounded authority, quorum-gated action eligibility, and refusal as safety invariants for autonomous and distributed systems operating under degraded legitimacy.

DOI: https://doi.org/10.5281/zenodo.18263457

Authority, Silence, and Failure Modes in AI-Driven Systems

Examines observed boundary failure patterns in AI-driven systems when authority, legitimacy checks, and silence-aware behavior are missing or violated.

DOI: https://doi.org/10.5281/zenodo.18340572

Refusal as a Legitimacy-Preserving Enforcement Act

Defines refusal as an explicit enforcement action that preserves legitimacy and safety when coordination degrades and authority cannot be established.

DOI: https://doi.org/10.5281/zenodo.18369763


Additional context and related publications are maintained at BLOCK VECTOR Technologies.

© 2026 BLOCK VECTOR TECHNOLOGIES LLC. All rights reserved.