Article at a Glance

The Weight of Wrong Assumptions

Every evaluation framework tilts toward what it was built to measure
THE EVALUATION SCALE
Assumptions in use — what the spreadsheet was built to see: price per user, features per dollar, implementation time, projected ROI. Real numbers, real arithmetic, wrong question.
Assumptions in play — what actually determines whether the purchase produces an outcome: configuration knowledge, institutional memory, shadow systems, the teaching relationship, behavioral distortion, knowledge evaporation. Unconsidered.
Forty years of confident evaluation. The heavier side never had a column.
Executive Summary

Every technology purchase begins the same way: someone builds a spreadsheet. Columns for price, features, implementation timeline, projected return. This framework has governed technology decisions for forty years. It feels like rigor. It produces defensible recommendations.

But the variables that determine whether a purchase becomes an outcome — configuration knowledge that lives in one person's head, shadow systems nobody planned, institutional memory that walks out the door — have never appeared on any evaluation. The gap between Total Cost of Ownership and Total Cost of Outcome has been invisible for four decades.

Something has arrived that the framework cannot categorize. It learns. It compounds. It accumulates the enabling conditions instead of consuming them. The question is no longer what does this purchase give me? — it is what will I need to teach, and am I willing to stay in the teaching relationship long enough for it to take?

The Blue Butterfly Mystery

Where frameworks go wrong

In 1979, Britain's Large Blue butterfly went extinct. It had been declining for decades and conservationists thought they knew why: collectors. People with nets, catching too many. The evidence seemed obvious. The butterfly was rare and beautiful. Collectors prized it. The population was shrinking. The framework pointed clearly at the cause.

So they built fences around the habitats. They restricted access. They patrolled the hillsides. They measured collector activity and felt rigorous about the intervention. The evaluation was careful, evidence-based, and wrong.

The butterfly kept dying.

A young PhD student named Jeremy Thomas spent six years on Dartmoor, tracking every stage of the Large Blue's lifecycle. What he found was invisible to anyone looking at the scale the conservation framework was designed to see.

The Large Blue caterpillar feeds on wild thyme for its first three weeks. Then it drops to the ground and secretes a chemical that tricks a specific red ant, Myrmica sabuleti, into believing it is an ant larva. The ant carries the caterpillar into its nest. The caterpillar spends the next eleven months underground, feeding on ant grubs, before emerging as a butterfly. The entire lifecycle depends on one species of ant.

That ant requires warm soil. It thrives only where grass is cropped short by grazing animals. When farmers stopped grazing livestock on those hillsides, and a virus called myxomatosis killed the rabbits, the grass grew taller by centimeters. A difference imperceptible to the human eye. But a centimeter of grass changes soil temperature by two to three degrees. For an ant the size of a grain of rice, that difference is massive.

The ant colonies collapsed. The caterpillars had nowhere to go. The butterfly disappeared.

And the fence the conservationists built to keep collectors out? It also kept grazing animals out. The grass grew taller behind the fence than in front of it. The intervention designed to save the butterfly accelerated its extinction.

The framework measured what it could see: collectors, access, population counts. The enabling conditions — soil temperature, driven by grass height, driven by grazing patterns, sustaining the one species of ant the butterfly's entire existence depended on — were invisible to it. Not because they were hidden. Because the framework wasn't built to see at that scale.

Decades of careful, evidence-based intervention. Precise measurement of the wrong variable. Confident action that made the problem worse.

Businesses have been making the same mistake with technology for forty years. The enabling conditions that determine whether a purchase becomes an outcome have never appeared on any evaluation. The framework was built to see the purchase. It has never been built to see what the purchase requires.


The Enabling Conditions

What the conservation framework couldn't see — and what your evaluation framework can't either
Grazing patterns: livestock + rabbits crop grass short
Grass height: a few centimeters — imperceptible
Soil temperature: 2–3°C shift — massive for an ant
Myrmica sabuleti: the one species the butterfly depends on
Large Blue butterfly: extinct in Britain, 1979

All of it invisible to the framework. The framework measured collectors, access, population counts. Built a fence. Accelerated the extinction.

Precise measurement of the wrong variable. Confident action that made the problem worse.

The Spreadsheet

The inheritance nobody questions

Every technology purchase starts the same way.

Someone builds a spreadsheet. Columns for price, features, implementation timeline, projected return. Vendors fill in their rows. The team scores each option. The winner has the best ratio of capability to cost.

This process has governed technology decisions for forty years. Across industries, across company sizes, across every category of software that has ever been sold. The same columns. The same arithmetic. Price per user. Features per dollar. Projected return per year.

The spreadsheet is an inheritance. Business leaders don't select this framework. They absorb it. From vendors who present products in these terms. From analysts who rank options by these metrics. From every purchase that came before. The evaluation model is older than the people using it.

And it feels right. It feels like rigor. Features are real. Prices are real. ROI projections are real arithmetic performed on real numbers. The spreadsheet produces a defensible recommendation that can be presented to a board, compared against alternatives, and justified with evidence.

No one questions the columns. No one asks whether the columns were built to see the outcome, or only the purchase.

Technology_Evaluation_2024.xlsx
fx =SUMPRODUCT(C4:C8,D4:D8)/SUM(D4:D8)

Vendor       Price/User/Mo   Features   Est. ROI   Config Burden   Knowledge Risk   Shadow Systems
Platform A   $45             32         187%       ?               ?                ?
Platform B   $38             28         203%       ?               ?                ?
Platform C   $52             41         164%       ?               ?                ?
Platform D   $29             19         142%       ?               ?                ?
Platform E   $61             47         221%       ?               ?                ?

4 columns of confident arithmetic. 3 columns that determine the outcome — empty.
The spreadsheet cannot see what it wasn't built to measure.
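For illustration, the winner-picking arithmetic is easy to reproduce. The vendor rows come from the mock spreadsheet above; the scoring weights are invented for this sketch and stand in for whatever weighting a real evaluation team would choose:

```python
# Hypothetical weighted-score evaluation, mirroring the mock spreadsheet above.
# The weights in score() are illustrative assumptions, not from any real evaluation.
vendors = {
    "Platform A": {"price": 45, "features": 32, "roi": 1.87},
    "Platform B": {"price": 38, "features": 28, "roi": 2.03},
    "Platform C": {"price": 52, "features": 41, "roi": 1.64},
    "Platform D": {"price": 29, "features": 19, "roi": 1.42},
    "Platform E": {"price": 61, "features": 47, "roi": 2.21},
}

def score(v):
    # Only the columns the framework can see: more features and higher ROI
    # push the score up, a higher price pulls it down.
    return v["features"] * 2 + v["roi"] * 100 - v["price"]

# The columns that determine the outcome have no numbers to score.
unscorable = ["config_burden", "knowledge_risk", "shadow_systems"]

winner = max(vendors, key=lambda name: score(vendors[name]))
print(winner)  # the "best" row, chosen without the three columns that matter
```

The point of the sketch is structural: whatever weights you pick, `unscorable` never enters the arithmetic, so the ranking is confident and complete while ignoring the variables the article argues decide the outcome.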

No one questions the columns. No one asks whether the columns were built to see the outcome, or only the purchase.

What the Columns Cannot See

Same technology, divergent outcomes

WordPress is free.

The companies running successful businesses on WordPress spend thousands a year on plugins, custom development, security patches, and the accumulated expertise of whoever learned to hold the whole thing together. A free platform produces wildly different results depending on what surrounds it. Some companies built their entire revenue engine on WordPress. Some built a website nobody visits. Both had the same WordPress.

Salesforce costs the same amount per user everywhere it is sold. The companies that grew revenue with Salesforce had a sales process that worked before Salesforce arrived. They had a culture where representatives believed in the process. They had management that acted on what the data revealed. The companies that got an expensive database had the same Salesforce. Same features. Same implementation partner. Same training. Same onboarding. The difference was invisible at purchase time and determined the outcome completely.

The technology is identical across all of them. The outcomes diverge completely. The variable that determined whether each purchase succeeded or failed was never on the spreadsheet. It was the enabling conditions — the expertise, the organizational readiness, the people who filled the gap between what the tool does and what the work requires.

Every outcome ran on that invisible layer of enabling conditions. Every evaluation ignored it.

Where the knowledge lives

8 systems. 1 person. No connections.

CRM — pipeline data, deal history. Config: 1 person knows.
Accounting — invoices, reconciliation. Custom rules: undocumented.
Email — client comms, approvals. Tribal: "ask Sarah."
Project Mgmt — tasks, deadlines, statuses. Workflows: copy-pasted.
Documents — SOPs, contracts, policies. Version: "check with Mike."
HR / Payroll — onboarding, benefits. Process: manual handoff.
Support — tickets, escalations. Routing: "it depends."
Analytics — reports, dashboards. Source: "which spreadsheet?"

The Person — the only thing connecting all 8 systems.

No integrations. No shared data model. One person is the integration layer.

8 systems. 0 connections. 1 integration layer.

The variable that determined whether each purchase succeeded or failed was never on the spreadsheet.

The Wrong Assumption

TCO vs Total Cost of Outcome

Every one of those purchases carried a hidden assumption. Not in the fine print. In the air. In the way the vendor presented it. In the way the buyer received it.

The assumption: buying the technology means buying the outcome.

The Salesforce rep didn't sell a database. The rep sold growth. Better pipeline visibility. Higher close rates. Revenue you couldn't reach before. The buyer didn't sign for a database either. The buyer signed for those outcomes. Both sides spoke as if the purchase and the outcome were the same transaction. They never were.

The industry built an entire framework around this assumption. Total Cost of Ownership. TCO. It accounts for license fees, implementation, training, support, maintenance, migration. It's more honest than the sticker price. It captures the real expense of running the software over time.

But TCO measures the cost of having the technology. Not the cost of getting the outcome. The companies that grew with Salesforce and the companies that got an expensive database had the same TCO. The most thorough accounting of the ingredient cannot see the meal.

What was missing was the Total Cost of Outcome — the real number, the one that includes the technology and everything the technology cannot do by itself.

Here is a rough version of that question: what percentage of your team's time goes to making the tool work, versus doing the work the tool was supposed to enable? Most people, when they sit with this honestly, find the ratio uncomfortable. Often more than half. That gap — between TCO and the Total Cost of Outcome — is the enabling conditions layer, made visible.
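A hypothetical back-of-envelope version shows how the two numbers diverge. Every line item and dollar figure below is invented for the sketch; the shape, not the amounts, is the point:

```python
# Hypothetical annual figures for one tool. TCO counts the cost of having it;
# Total Cost of Outcome adds what the tool cannot do by itself.
tco_items = {
    "licenses": 30_000,
    "implementation": 12_000,
    "training": 5_000,
    "support": 4_000,
    "maintenance": 3_000,
}
enabling_condition_items = {
    "shadow_systems_upkeep": 18_000,        # spreadsheets and workarounds nobody planned
    "config_knowledge_in_one_head": 25_000, # time spent being the integration layer
    "rework_from_fragmented_data": 10_000,  # decisions delayed across four systems
}

tco = sum(tco_items.values())
total_cost_of_outcome = tco + sum(enabling_condition_items.values())
print(tco, total_cost_of_outcome)  # 54000 107000
```

With these invented numbers, roughly half the real cost sits outside the TCO line items, which is the "often more than half" ratio the question above tends to surface.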

It accumulates in recognizable ways. Configuration knowledge that lives in one person's head and walks out when they do. Shadow systems — the spreadsheets and workarounds that fill every gap between what was purchased and what was needed, that nobody planned, that became load-bearing. Decisions that never got made because the data required to make them was scattered across four systems in four formats behind four logins.

These costs belong to the outcome. They appear on no evaluation. They are absorbed silently, year after year, called "cost of doing business."

Everyone has felt this. The implementation that was supposed to take six weeks and took six months. The system that works until the person who configured it leaves. The tool that promised to save time and instead created a new category of time: time spent making the tool work.

But the largest cost is the wrong assumption itself. The years spent believing the tool would deliver the outcome, and the compounding gap between what was promised and what arrived. The conservationists didn't just lose time building a fence. They lost the years they could have spent understanding the ecology. The wrong assumption doesn't just fail to solve the problem. It prevents you from seeing what the problem actually is.

Forty years of invisible compounding.


The Canyon You Can't See

TCO measures what you have. Total Cost of Outcome measures what it takes to get the result.

Chart: cost ($0–$150K) over Years 1 through 5. Two lines — TCO below, Total Cost of Outcome above. The widening gap between them: what the spreadsheet cannot see.
What fills the gap
Configuration knowledge in one person's head
Shadow systems nobody planned
Data fragmented across 8 systems
The person who leaves and takes it all
Opportunity cost of work beneath them
The most thorough accounting of the ingredient cannot see the meal.


The Recursion

Eight platforms, one person holding it together

There is a reason this pattern persists.

The tools themselves require knowledge to configure well. That configuration knowledge is fragmented, undocumented, and person-dependent. The software purchased to organize work requires organized knowledge to operate. The problem the technology was supposed to solve is the same problem the technology imposes.

Every platform multiplies the tax. An organization running eight platforms doesn't have eight times the institutional capability. It has the same capability scattered across eight systems, with eight configuration surfaces, eight sets of tribal knowledge, eight sets of workarounds. The person who remembers which folder it's in. The person who knows that note in the CRM is the one that matters. The person who can look at four screens and see one picture.

Every new tool was supposed to make people less load-bearing. Instead, people became the only thing connecting all the tools. The technology meant to reduce reliance on people's heads made reliance on people's heads the only thing holding the organization together.

And the evaluation model that produces this outcome was built by the vendors who sell the technology. The model measures what they sell. The evidence that would change the model lives in a dimension the model was never built to see. Each purchase reinforces the framework that produced it, because the framework can see price and features and cannot see whether the purchase will compound or fragment the enabling conditions — whether institutional knowledge will accumulate or evaporate, whether people will become more or less load-bearing, whether the outcome will arrive or whether the technology will sit there waiting for what was never part of the evaluation.

The spreadsheet can see the ingredient. It has never been able to see the meal.

One invoice. Eight stops.

The path a single accounts payable task takes through the system

1. Gmail, 9:02 AM — invoice arrives from vendor. PDF attachment, no standard format.
2. Bill.com, 9:14 AM — upload invoice, manually key line items. Match to PO — which PO? Check Slack.
3. Slack, 9:22 AM — "Hey, does anyone know the PO for Acme?" Waiting... waiting... Sarah replies at 10:45.
4. Google Drive, 10:48 AM — find the PO PDF in the vendor folder. "Vendor Contracts 2024" or "Vendor Contracts (OLD)"?
5. QuickBooks, 11:03 AM — code to the correct GL account, enter payment. Which cost center? Check the spreadsheet.
6. Google Sheets, 11:15 AM — update the tracking spreadsheet (the shadow system), because QuickBooks reports "don't work for us."
7. Salesforce, 11:28 AM — log the vendor payment against the deal record so the account manager can see the spend.
8. Gmail, 11:28 AM — reply to vendor: "Payment processing."

2 hours 26 minutes. One invoice. Back where you started.

Every new tool was supposed to make people less load-bearing. Instead, people became the only thing connecting all the tools.

What Was Eliminated Before

Work that was beneath the people doing it

Work has been eliminated before. Nobody mourns it.

In the early 1900s, making a phone call required a human intermediary. You picked up the phone and a voice answered: "Number please." That was a switchboard operator. She sat at a massive board with hundreds of cables and jacks. You told her who you wanted to reach. She physically plugged in a cable to connect your line. In 1920, there were over 200,000 switchboard operators in America. Entire buildings full of them. By the 1980s, the job was gone. Communication got better, faster, cheaper. Nobody wants to go back.

Before the electronic computer existed, "computer" was a job title. It meant a person who computes. Rooms of people doing calculations by hand. Pencil, paper, slide rules, mechanical adding machines. NASA had them. Banks had them. Katherine Johnson, Dorothy Vaughan, Mary Jackson: human computers calculating rocket trajectories by hand. Your job, if you were a Computer: thirty to forty hours of hand calculations for a single artillery trajectory. One trajectory. Forty hours of a brilliant person's time.

Then electronic computers arrived. The machine has the name now. We don't even remember that "computer" used to mean a person.

Katherine Johnson became an engineer and made contributions no machine could make. The elimination of that work freed her for work worthy of her mind.

None of us mourn these jobs. The work was beneath the people doing it. The people deserved better than that work.

The question lands closer to home than anyone expected: what work in your business today should be eliminated? What work are your people doing that is beneath them?

What Was Eliminated Before
Nobody Mourns This Work
Switchboard operators. Human computers. Work that was beneath the people doing it.

Triptych: switchboard operator, human computer, Katherine Johnson at a NASA console.
"The elimination of that work freed her for work worthy of her mind." Katherine Johnson became an engineer.

The question lands closer to home than anyone expected: what work in your business today should be eliminated?

200,000 switchboard operators (1920). 0 mourned. 1 question that matters now.

What This Looks Like

64,000 emails to zero in two months

Before the theory, a story.

I had 64,000 emails in my inbox. Two hundred new ones arriving every day. I had tried every tool, every filter, every system. Folders. Labels. Rules. Priority inboxes. The tools worked the way tools work: they did exactly what I configured them to do, and the problem was never configuration. The problem was that managing email at that volume requires judgment about my business, my relationships, my priorities. No filter rule can encode that. I had been chasing zero inbox for years. Every tool promised it. None delivered.

What I did instead was teach a worker.

Not install a tool. Not configure a system. Teach. Slowly. Over weeks. I taught it about me. About my business. About which senders matter and why. About the difference between an email that looks urgent and one that actually is.

The first week was more work, not less. Everything flagged. Everything reviewed. It felt like training a new hire who doesn't know the business yet. Because that's what it was. By the fourth week, it was handling the routine without asking. By the second month, it knew things I hadn't explicitly taught it — learned from the corrections, from the patterns. I opened my inbox one morning and it was empty. Not because nothing had arrived. Because everything had been handled, routed, flagged, or filed by a worker that understood my business well enough to act on my behalf.

Zero inbox. After years of trying. Not because the technology was better. Because it was a different kind of thing. It learned. The outcome required patience, teaching, relationship, and time. No tool could deliver it because tools don't learn.

The same pattern at a different scale. In every company, knowledge lives in people and nowhere else. Someone knows which vendors invoice net-15 versus net-30. Someone knows which supplier always rounds up equipment charges. Someone knows how to read that person's handwriting on the field sheets, and if that person isn't in the office, things slow down.

An AI worker reads scanned documents. It matches names to rate sheets. It extracts line items from handwritten forms. In its first week, it flags everything it's uncertain about. The team reviews every line. By month one, it has learned the handwriting, learned what a normal project looks like for each location. By month three, it has learned seasonal patterns and stopped flagging normal variation as anomalies. By month six, routine work is processed end to end. By year one, it handles ninety to ninety-five percent of the routine without intervention.

Nobody installed a patch. Nobody configured a new rule. The worker learned the same way a new employee learns: by doing the work, getting corrected, and remembering. The difference: this employee never forgets what it learned, never has a bad day, and never takes what it knows to a competitor.

This is not a magic story. There are conditions that determine whether the outcome arrives. The domain needs to be stable enough for patterns to hold. The team needs to be patient enough to stay in the teaching relationship through the early mistakes. Organizations that expect to install a solution and walk away will be disappointed: the worker will still learn, but it will learn the wrong things. A badly supervised worker bakes in bad patterns just as permanently as good ones. The enabling conditions matter here exactly as they did for every purchase that came before. The difference is that when the conditions are present, that invisible layer starts working in your favor instead of against you.

If you stay with it the way you'd stay with your best new hire, something shifts. The mistakes get rarer. The judgment gets better. The worker knows your business well enough that you trust it. Not because you were told to. Because it earned it.

The Proof

What This Looks Like

The same inbox. Two different approaches.
Every Tool I Tried
from:boss · has:attachment · older_than:7d · label:urgent · is:unread · subject:invoice
50+ filters. None of them worked.
What Teaching Produced
Routine → auto-handle. Needs judgment → flag. Priority → notify.
It learned which is which.

Not install a tool. Not configure a system. Teach.

The Category Shift

Tool frame vs worker frame

Every previous technology purchase fit the same frame. The technology is a tool. You evaluate features, compare vendors, calculate ROI, buy it, configure it, train people on it, roll it out. This is how it has worked for the entirety of your career.

CRM is a tool. You configure it. It does what you set up. You enter data. It stores data. You pull reports. It generates reports. Between interactions, it sits there. The tool frame was accurate for CRM. The category was right.

Something arrived that the framework cannot categorize.

The simplest test: if the system performs identically in month six as it did in month one — same accuracy, same judgment, same failure modes — it is a tool. Evaluate it as a tool. The spreadsheet applies. But if the system's performance changes through use, if it improves as it encounters more of your specific business, if the gap between its errors in week one and in month six reflects genuine learning — it is a worker. The spreadsheet does not apply.
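That month-one versus month-six comparison can be made concrete. A minimal sketch, where the improvement threshold is my own illustrative assumption rather than anything from the article:

```python
def tool_or_worker(errors_month1: int, tasks_month1: int,
                   errors_month6: int, tasks_month6: int,
                   min_improvement: float = 0.25) -> str:
    """Crude categorization: a worker's error rate should fall through use.

    min_improvement is an arbitrary illustrative threshold: the fraction by
    which the error rate must drop before we call the change genuine learning.
    """
    rate1 = errors_month1 / tasks_month1
    rate6 = errors_month6 / tasks_month6
    if rate1 > 0 and (rate1 - rate6) / rate1 >= min_improvement:
        return "worker"  # performance improved materially through use
    return "tool"        # same accuracy, same failure modes as day one

print(tool_or_worker(40, 400, 40, 400))  # static system -> "tool"
print(tool_or_worker(40, 400, 8, 400))   # learning system -> "worker"
```

A real assessment would also check that the failure modes changed in kind, not just in count, but the one-number version captures the categorical question: does performance move at all?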

That zero inbox didn't come from a filter. The invoice processing didn't come from a template. These outcomes came from something that initiates, that exercises judgment within boundaries you set, that operates without you present, that develops over time. Something you teach, not something you configure.

This does not describe a tool. It describes a worker.

The distinction changes every downstream question.

The tool frame asks: what features does this software have? What does it cost per user? How does it compare to the other options on the spreadsheet?

The worker frame asks: what outcome does this worker own? Who supervises? When do they intervene? How does the worker improve? What happens when it knows your business better than it did when it started?

The tool frame produces a purchase. The worker frame produces a management relationship. A purchase is a transaction that depreciates. A management relationship is a capability that compounds.

Business leaders already know how to think this way. When you hire, you don't evaluate based on features. You define the role. You expect ramp-up time. You know some roles need close supervision and others you let run. You manage performance continuously. You develop capabilities over time.

This framework exists. It has governed how you manage people for your entire career. It has never been applied to a technology purchase because no technology purchase has ever warranted it. Until now.

And here is where the Total Cost of Outcome finally inverts.

With every previous technology, the cost of outcome was always higher than the cost of ownership, and the gap always grew. TCO told you one number. The real cost was always worse.

With an AI worker — when the enabling conditions are present — the cost of outcome is front-loaded. The first weeks are expensive in time, attention, and patience. You are teaching, not installing. But the curve bends. The worker learns. The knowledge stays. The capability compounds. By month six, the cost of outcome is lower than the cost of ownership of the tool it replaced. For the first time in forty years, the enabling conditions are accumulating in the system instead of bleeding out through turnover.
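The bending curve can be sketched numerically under stated assumptions. Every figure below is hypothetical: a flat monthly overhead for the tool, a front-loaded supervision cost for the worker that decays as it learns:

```python
# Illustrative cost-of-outcome curves. All numbers are hypothetical:
# a tool's overhead never declines; a worker's teaching cost starts
# higher and decays as corrections accumulate.
TOOL_MONTHLY = 4_000   # steady cost of making the tool work (shadow systems, rework)
WORKER_START = 9_000   # month-1 teaching cost: review everything, correct everything
DECAY = 0.85           # each month the worker needs 15% less supervision

def monthly_cost(month: int, kind: str) -> float:
    if kind == "tool":
        return TOOL_MONTHLY
    return WORKER_START * (DECAY ** (month - 1))

# First month in which the worker costs less than the tool it replaced.
crossover = next(m for m in range(1, 25)
                 if monthly_cost(m, "worker") < monthly_cost(m, "tool"))
print(crossover)  # with these assumed numbers, month 6
```

The specific crossover month depends entirely on the assumed decay rate; the structural claim is only that one curve is flat forever and the other bends down.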

The Reframe

The Category Shift

Two frames. Two entirely different questions.
Tool Frame
1
Purchase
2
Configure
3
Depreciate
4
Replace
A transaction that depreciates.
Worker Frame
1
Hire
2
Teach
3
Compound
4
Trust
A capability that compounds.
Cost of outcome over time: the tool's line holds steady while the worker's starts high and falls below it. The crossover.

The tool frame produces a purchase. The worker frame produces a management relationship. A purchase is a transaction that depreciates. A management relationship is a capability that compounds.

What Changes

Knowledge that compounds instead of evaporating

SaaS vendors will tell you software doesn't depreciate. It updates. They're right that the product gets better. What they mean is that general capability improves for every customer on the same day. The new button goes to everyone. The new feature ships to the whole user base at once.

But studies consistently show that roughly 20% of features drive 80% of actual use. The update cycle improves the 80% of the product most users never touch while rearranging the 20% they depend on. Feature bloat is now cited as a leading driver of product abandonment. The capability the vendor is shipping is not the capability you are using. And the overhead of finding the new button, relearning the changed workflow, adapting to a redesigned screen your team had finally internalized — that cost lands on your people, not the vendor.

More importantly, no update has ever made the software know your business better. It knows more about everyone in general. Your vendors, your seasonal patterns, your exceptions, the handwriting on the field sheets — none of that accumulates in the system. It still lives in people. It still walks out when they leave.

An AI worker, when the learning conditions are present, is a different kind of thing. It starts at its least capable. It gets better through use — not general use, your use. The gap between month one and year one is the gap between a new hire and a trained veteran. And what it learns belongs to you specifically. No other customer gets that update.

The spreadsheet was built for assets that behave the same way for every buyer. It has no columns for something that becomes more valuable the more it knows about your specific business.

There is a natural concern. If workers handle more over time, what do people do?

The honest answer: different work. Your best person's morning used to be reading handwritten forms, looking up rates, typing numbers into spreadsheets. Skilled work. Requiring accuracy and attention. Not requiring their judgment. The workers handle what can be learned from patterns. Your people handle what requires judgment, relationships, and the kind of context that only comes from being human and present.

Some people thrive in this shift. They were always frustrated by the routine. They come alive when the busywork clears. Some people struggle. The routine was comfortable. They built their professional identity around it. That transition is real. It takes time, support, and honesty about what's changing and why. Pretending otherwise would be dishonest.

And the nightmare every business owner knows — your key person takes vacation, things slow down; they leave, everything starts over — changes too. With AI workers, what the team teaches stays. A new person inherits a trained system and is productive in weeks.

For forty years, every technology purchase increased the organization's dependence on the people who knew how to use the technology. Knowledge lived in people and left when they did. This is the first category where the knowledge that used to evaporate now compounds — where the enabling conditions accumulate in the system instead of walking out the door.

The Shift

What Changes

From hundreds of interfaces to three.
Your Current Reality

Dynamics 365 (account: Acme Corp, contact: Sarah Chen — Save & Close). Outlook. Power BI. SharePoint (Q3 Report.xlsx, Vendor List.docx, Budget v3.xlsx). Excel Online (Acme $2,340 pending, Globex $890 paid). Teams. D365 Finance (GL 6200-Office Supplies, cost center CC-4401 Operations — Post). Admin Center (MFA required, auto-archive, retention: 90d). Planner (Review PO, Approve inv, Update GL). OneDrive ("Contracts 2024", "Contracts (OLD)", "Misc", "Archive"). Setup Wizard (step 3 of 5: configure data mapping fields for import...). Word Online (Standard Operating Procedure: Invoice Processing v4.2, draft). Azure Portal. Invoicing (INV-2024-0847, $4,205.00). Security (SSO enabled, password: 90d). Power Apps (choose a connector for your data source...). Dataverse (Accounts: 1,247. Contacts: 3,891). Intune (compliance, app deploy, device cfg). Bookings (Mon 9am setup call, Tue 2pm review, Wed 10am training). Compliance (DLP-Finance-01). More... Loading...

30+ screens. Every day.
Every app spawns a dozen sub-screens. None of them talk to each other.
With an AI Worker

Chat — tell it what you need. "Process the Acme invoices from this week." → "Found 4 invoices. Matched to POs. GL codes applied from last quarter's pattern. Ready for review."
Validate — supervise and correct. Acme #4471, $2,340: approve. Acme #4472, $890 (new vendor): correct. Acme #4473, $1,205: approve.
Dashboard — see what matters. 94% auto-handled. 3 need judgment. 0 errors this week.

Conversation. Validation. Dashboard. That's it.

For forty years, every technology purchase increased the organization's dependence on the people who knew how to use the technology. This is the first category where the knowledge that used to evaporate now compounds.

The Number That Never Appeared

The question underneath every evaluation

You didn't buy your house for the house.

You bought it for the Sunday morning where your kid is on the kitchen floor with crayons and you're making eggs and nobody is going anywhere. You bought the backyard where your friends are standing around the grill arguing about brisket.

But when the inspector walked through before closing, they found things the listing didn't mention. Pipes that would need replacing in three years. A drainage issue behind the foundation wall. These weren't defects serious enough to kill the deal. They were just costs the listing couldn't see — absorbed silently after purchase, called "cost of homeownership."

Every technology you have ever bought worked the same way. The listing showed features and promised outcomes. The enabling conditions the tool couldn't provide — the configuration expertise, the knowledge that walked out with departing employees, the shadow systems your team built to fill the gaps — were absorbed silently, year after year, called "cost of doing business." The people who bore those costs were your best people. They used their judgment to fill every gap between what was purchased and what the work required.

What would your people do if they didn't have to?

That is not a rhetorical question. It is the question underneath every number in every evaluation you have ever done. The spreadsheet has never been able to see it. The butterfly conservationists built a fence because their framework couldn't see grass height. You have been building fences for forty years.

The evaluation framework was designed for tools. It measured what tools offer. It measured them well.

Something arrived that the framework cannot categorize. It learns. It compounds. It accumulates the enabling conditions instead of consuming them.

But the thing that will determine whether you get the outcome or another expensive database is not which product you select. It is not which vendor you trust. It is not which row wins the spreadsheet.

It is whether you walk in asking the old question — what does this purchase give me? — or the new one: what will I need to teach, and am I willing to stay in the teaching relationship long enough for it to take?

The spreadsheet cannot see that question. But you can. And it is the only question that matters now.

The Question
What will I need to teach, and am I willing to stay in the teaching relationship long enough for it to take?
The spreadsheet cannot see that question. But you can. And it is the only question that matters now.
READY TO START?

The spreadsheet cannot see that question. But you can.

Let’s talk about what your evaluation framework isn’t measuring.
Schedule a Conversation
No pitch. No demo. Just the question that matters.