What Almond Milk Taught Me About Construction Technology

Your weekly dose of tech insight for Arizona's builders

I was scrolling through social media last week when a post stopped me mid-thumb.

A woman had lined up five different types of milk on her kitchen counter: oat, cashew, peanut, almond, and soy, each one labeled, each one poured into its own glass. She was taste-testing them all, side by side, with a full breakdown of the health benefits, the concerns, and a final ranking based on actual data.

Not opinions. Not whatever her friend recommended. Not what the label promised.

A real, structured, side-by-side comparison.

And I thought: Why don’t more contractors do this with their technology?

The Gut-Feel Trap

Here’s what I see every week in this industry.

A superintendent hears about a new scheduling app at a trade show. Sounds great. The demo looked clean. The sales rep knew all the right words. So the company buys it. Rolls it out. And three months later, it’s sitting unused on everyone’s phone like a gym membership in February.

Nobody compared it to what they already had. Nobody ran it alongside the existing process to see which one actually performed better. Nobody tested it.

They went with their gut.

And look, I get it. Construction is a gut-feel industry. You’ve built a career on instinct, experience, and knowing when something doesn’t look right on a jobsite. That’s earned. That matters.

But when it comes to technology decisions, your gut is expensive.

McKinsey’s research on construction found that firms that embrace digital tools and a data-driven culture are 50% more likely to finish projects on time and within budget. Meanwhile, PwC reported that data-driven organizations are three times more likely to see significant improvements in decision-making.

The gap between “we think” and “we know” is where your margin lives.

So What Is A/B Testing, Exactly?

A/B testing is just a fancy term for what that woman did with her milk.

You take two options, Option A and Option B, and run them side by side under the same conditions. Then you measure the results. Not with opinions. With data.

In the digital world, 77% of companies worldwide now run A/B tests on their websites. It’s how Netflix decides which thumbnail you see. It’s how Amazon decides where to put the “Buy Now” button. It’s how every major tech company makes decisions, not based on what their smartest person thinks, but based on what actually works.

The A/B testing software market hit $1.5 billion in 2025 and is projected to reach $4.4 billion by 2035. This isn’t a trend. It’s how modern businesses operate.

And there’s no reason your construction company can’t think the same way.

What A/B Testing Looks Like on a Jobsite

You’re not going to set up a laboratory in your trailer. But you can absolutely adopt the mindset. Here’s what it looks like in practice:

  1. The Daily Report Test
    Your field crews currently fill out paper daily reports. Your PM has been pushing for a digital daily report app. Instead of switching everyone overnight, run both systems on one project for 30 days.

    Measure: How long does each method take? Which one has more complete data? Which one gets submitted on time? Which one actually gets read by the PM? At the end of 30 days, you don’t have an opinion. You have an answer.
  2. The RFI Response Showdown
    On Project A, RFIs go through the traditional email chain: field to PM to architect and back.
    On Project B, RFIs go through a centralized platform with automated routing and tracking.

    Measure: Average response time. Number of RFIs that “fell through the cracks.” How many required follow-up because the original response was incomplete.

    One of our clients ran this exact comparison and found that platform-routed RFIs were resolved 40% faster, not because anyone worked harder, but because nothing got lost in an inbox.
  3. The Onboarding Experiment
    You just bought Procore. Or Autodesk Build. Or whatever platform you’re rolling out this year. Instead of one company-wide training day and a prayer, try two approaches:

    Team A gets the traditional classroom-style training. Team B gets a structured, 15-minute-a-day guided workflow over two weeks, learning by doing on their actual project data.

    Measure: Feature adoption after 60 days. Help desk tickets. Who’s actually using the software versus who reverted to spreadsheets.

    We see this constantly as an MSP. The training method matters more than the software itself. 77% of contractors who fully adopt their technology see higher margins. But adoption doesn’t happen by accident. It happens by testing what actually works for your team.
  4. The Communication Channel Test
    You need subcontractors to confirm schedule changes within 24 hours. Test two methods: Group A gets an email notification. Group B gets a text message with a simple “Confirm Y/N” link.

    Measure: Response rate within 24 hours. I’ll save you the suspense: the text wins almost every time. But the point isn’t the answer. The point is that you measured it instead of assuming.

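The communication test above is simple enough to score yourself. Here’s a minimal Python sketch with invented delay data (the numbers and list names are ours, purely for illustration), counting who confirmed within the 24-hour window:

```python
from datetime import timedelta

# Invented data: how long each subcontractor took to confirm a schedule
# change. None means they never confirmed at all.
email_confirms = [timedelta(hours=30), None, timedelta(hours=6), None, timedelta(hours=22)]
text_confirms = [timedelta(hours=1), timedelta(hours=3), None, timedelta(hours=12), timedelta(hours=2)]

def confirm_rate(delays, window=timedelta(hours=24)):
    """Share of subs who confirmed within the window."""
    on_time = sum(1 for d in delays if d is not None and d <= window)
    return on_time / len(delays)

print(f"Email: {confirm_rate(email_confirms):.0%} confirmed within 24h")  # 40%
print(f"Text:  {confirm_rate(text_confirms):.0%} confirmed within 24h")   # 80%
```

Swap in your real timestamps from whatever system logs the confirmations; the comparison logic doesn’t change.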
The MSP Parallel: We A/B Test Everything

I’ll be transparent: we do this in our own business.

When we’re evaluating a new security tool for our clients, we don’t just trust the vendor’s pitch deck. We pilot it alongside the existing solution. We compare detection rates, false positives, user impact, and actual cost-per-endpoint.

When we roll out a new backup and disaster recovery platform, we test recovery times head-to-head. Vendor A says “15-minute recovery.” Great. Prove it. We run a simulated recovery on both platforms and time it.

When we’re onboarding a new client, we’ve tested different implementation sequences (email migration first, security hardening first, or endpoint management first) to see which approach leads to the smoothest transition and fastest user adoption.

The principle is the same whether you’re choosing between almond milk and oat milk, or between two project management platforms. Don’t trust the label. Taste it yourself.

The “We’ve Always Done It This Way” Tax

Here’s the part nobody talks about. Every process you haven’t tested is costing you something. You just don’t know how much. McKinsey found that 80% of large construction projects exceed their budget by 20% or more. And a huge contributor is what they diplomatically call “cognitive bias and flawed assumptions,” which is McKinsey’s way of saying, “You guessed, and you guessed wrong.”

Construction operates on margins of 2–7%. In a business where a 10% cost overrun on a bid can turn a profitable project into a money-loser, the difference between “we think this works” and “we tested this and it works” isn’t academic. It’s survival.

The industry that built the Hoover Dam still makes most of its technology decisions the same way you pick a restaurant on vacation: someone heard it was good.

That’s the “We’ve Always Done It This Way” tax. And you’re paying it on every project where you haven’t tested your assumptions.

How to Start: The 3-Test Challenge

You don’t need a data science team. You don’t need expensive software. You just need the discipline to compare before you commit.

Here’s my challenge to you: Pick three things in your business and A/B test them over the next 90 days.

Not sure where to start? Here are candidates that almost always reveal something surprising:

  • Your communication method for schedule changes: email vs. text vs. platform notification
  • Your new-hire technology onboarding: classroom training vs. guided daily walkthroughs
  • Your document management approach: current system vs. that platform your PM keeps asking about
  • Your estimating workflow: manual takeoff vs. AI-assisted takeoff on the same set of plans
  • Your meeting cadence: weekly 60-minute project meetings vs. daily 15-minute standups

The rules are simple: Run both options simultaneously. Measure the same things for both. Give it enough time to be meaningful (usually 30–60 days). And then let the data make the decision.
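One caution on “let the data make the decision”: with small samples, a gap can be pure luck. A standard two-proportion z-test is enough to sanity-check whether a difference is bigger than noise. Here’s a minimal Python sketch with made-up numbers (ours, not from any client project):

```python
import math

def two_proportion_z(success_a, total_a, success_b, total_b):
    """Rough check: is the gap between two measured rates bigger than noise?

    Returns the z-score. |z| above ~1.96 means the difference would be
    surprising if the two options actually performed the same.
    """
    p_a = success_a / total_a
    p_b = success_b / total_b
    pooled = (success_a + success_b) / (total_a + total_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    return (p_a - p_b) / se

# Made-up 30-day example: 52 of 80 paper daily reports submitted on time
# vs. 68 of 80 digital reports.
z = two_proportion_z(52, 80, 68, 80)
print(f"z = {z:.2f}")  # |z| > 1.96 here, so the difference is probably real
```

You don’t need to memorize the formula; the point is that a ten-minute check like this keeps you from switching platforms over a difference that was just a lucky month.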

The Almond Milk Lesson

That woman on social media didn’t just pick the milk with the best marketing. She didn’t ask her neighbor. She didn’t go with whatever was on sale. She lined them all up. She tested them. She measured the results. And then she made an informed decision.

Your construction company deserves the same approach to every technology decision, every process change, and every workflow improvement.

Stop guessing. Start testing. The data doesn’t care about your gut, and your margins will thank you for it.



We’re Running Our Own A/B Test And We Need 2 Minutes

We just spent an entire blog post telling you to stop guessing and start measuring. So we’re putting our money where our mouth is.

We’re building the 2026 Arizona Construction Tech Stack Snapshot: a real-world look at what Arizona’s builders are actually using, where the frustrations are, and how your tech stack compares to the rest of the industry. Not vendor rankings. Not G2 reviews. Real feedback from real contractors.

4 questions. All multiple choice. Takes about 2 minutes. We’ll publish the results in a future TechTip so you can see how your approach stacks up against the rest of Arizona’s AEC community.

Take the 2-Minute Tech Stack Survey
(Your responses are anonymous. We’ll only use aggregate data in the published results.)



Ready to stop guessing and start testing? Let’s talk about where technology decisions are costing you and how a structured approach can turn your IT from an expense into an advantage.

Book a Free Consultation Review with Computer Dimensions.

For over 20 years, Computer Dimensions has been the trusted IT partner for Arizona's architecture, engineering, and construction industry. We help AEC firms communicate better, collaborate smarter, and actually use the technology they've invested in. Because in construction, the tools only work if your team does.

IT Built For Builders.

P.S. If you’ve been “meaning to look into” a new tool, platform, or process for your company but haven’t pulled the trigger, that hesitation is actually smart. The next step isn’t buying it. The next step is testing it. Give us a call and we’ll help you set up a real comparison so you can make the decision with confidence, not hope.

 


Jack Enfield

About the Computer Dimensions Blog

This online digest is dedicated to exploring information, solutions, and technology relevant to small and mid-sized businesses and organizations.

Content is brought to you by Computer Dimensions, a Tucson IT company that has been providing trusted technology service and solutions since 1995.

Visit Computer Dimensions



Call Us Today (520) 743-7554