The work moves across a small set of linked institutions: a consulting practice, an applied-research lab, and a governance-oriented think tank. Each informs the others; each engagement benefits from all three.
Strategic technology counsel for regulated, mission-driven, and growth-stage organizations: AI governance, enterprise modernization, privacy, and digital transformation. Principal-led delivery with direct senior access.
An applied research platform building high-impact AI systems for high-risk and regulated domains, from pneumonia diagnosis to novel neural architectures, and the governance structures that make their deployment defensible.
A global think tank advancing responsible AI and emerging-technology governance: standards, frameworks, and public conversation across scholarship, industry practice, and policy.
There is a version of this work that happens in decks. Frameworks are referenced; principles are cited; a policy document is circulated. The photograph of governance is taken, filed, and rarely opened again. This practice is not that.
The work begins where the slide deck ends: in the intake form nobody has filled out correctly in six months, in the model inventory nobody trusts, in the incident runbook that was never tested, in the quiet understanding among the technical staff that the policy is aspirational and the production system is something else. Governance is the distance between those two things. Closing that distance is what the practice does.
“Policy that cannot survive contact with an engineer is not policy. It is decoration.”
I came to this work from both sides: as someone who has written the policies and as someone who has had to live with them. That dual experience is unusual, and it shapes how engagements run. A framework that cannot be operationalized is a liability, not an asset. A regulation that cannot be implemented by a real team, in a real environment, with real budget constraints, is a regulation that will be worked around. The job is to design the program so that the easy path and the right path are the same path.
That requires a specific kind of attention. It requires sitting with the details: the data-flow diagram no one has updated, the vendor contract nobody has re-read, the board report that says the right words but describes the wrong system. It requires the willingness to tell a client that the answer is no, or that the answer is yes but it is going to cost more than they want to spend, or that the problem they are asking about is not actually the problem.
And it requires a point of view. Not a neutral one (I don't believe neutrality is an honest posture for this work) but a considered one. Emerging technology should be advanced, not resisted, but advanced in a way that is sustainable, ethical, and operationally real. That last word is doing the most work. Operationally real means the rubber has to meet the road: the program has to function on a Tuesday afternoon in the middle of a quarter, with the people who actually work there, using the systems that actually exist.
The practice is built around that standard. Everything else follows.
The work clusters around four areas that keep colliding with one another in the real world, and that is exactly the point. Governance without privacy is incomplete; privacy without security is theoretical; all three without someone in the executive seat to drive them are a stack of unread documents.
Every engagement is scoped to the situation, but most move through the same five movements. Not the stages of a waterfall but overlapping passes, each one making the next more accurate.
Engagements start with a complimentary 30-minute conversation. The goal is not to sell; the goal is to understand the situation well enough to know whether the practice is the right fit, and to say so honestly either way. If it isn't, I'll tell you where I'd point you instead.
Before a single recommendation is made, the landscape gets mapped: stakeholders, systems, regulatory exposure, internal politics, existing documentation, and the gap between stated policy and lived practice. The map is shared back to the client verbatim.
Whatever gets built (framework, program, policy, or role) is designed to function under normal operating pressure, not in a vacuum. Every artifact is stress-tested against one question: could a competent person, handed this, run it?
Delivery is not a document drop. Where engagements continue past design, I work directly with the internal team: sitting in the meetings, drafting the first three reports, reviewing the first incident, co-presenting to the board, until the thing is breathing on its own.
The goal is never indefinite retention. The goal is an internal capability that outlasts the engagement, with a clear transfer plan, documented reasoning, and a named successor inside the client organization. Consulting that creates dependency is consulting that failed.
Background, credentials, training, teaching, and the longer story behind the practice, in one place.
Advance emerging technology — in a way that is sustainable, ethical, and operationally real.
If any of the above resonates with the situation you're in (regulated, mission-driven, or at a juncture where the governance of AI actually matters), the right next step is a 30-minute call. No pitch, no pressure.
info@noahkenney.com