
Author: Matt Merryfull

AI Isn’t the Future, It’s the Filter

Temporary Concern, or the New Normal?

TL;DR

Businesses are no longer deciding whether to adopt AI — they’re deciding how quickly they can do it before someone else does. Traditional digital modernisation often failed due to high upfront cost and complexity. AI flips that — reducing resource overhead, accelerating delivery, and changing how software and strategy are approached.

But…

From due diligence to product design, AI-readiness is now a permanent evaluation lens. Companies ignoring it risk being outpaced or devalued. But moving fast without architectural discipline — à la vibe coding — introduces its own fragility.

To help teams move with confidence, I’ve developed the Blacklight 4D Framework:

Discover → Diagnose → Design → Deliver — a structured path to uncover, validate, and execute on AI-native opportunities.

📩 I work with investors, founders, and teams to navigate innovation, M&A, and strategic tech delivery. Let’s talk if you’re building, buying, or betting on the next wave.


Across industries, businesses are facing a clear fork in the road: evolve with AI, or be overtaken by those who already have. In every due diligence engagement I’ve run over the past 12 months — from payments to policy to platform ventures — AI is no longer a speculative layer. It’s a strategic constant.

What was once an exploration — “Could we use AI here?” — is now a gating condition: “Are you AI-ready enough to move forward?”

If you’re not embedding AI into your architectural thinking, operational model, and commercial roadmap, you’re preparing to compete against businesses that already have — and they’re doing it faster, leaner, and smarter.


From Paper-Tiger Modernisation to AI-Native Execution

Traditional business modernisation promised leverage: digitise your systems, connect your data, unlock new markets. But it often fell flat. Expensive COTS systems, bloated middleware layers, and months of onboarding for abstract outcomes. Most companies balked at the cost, because the resource overhead and risk outweighed the perceived opportunity.

Now? AI-native strategies have flipped that dynamic.

  • Prototyping timelines have collapsed.

  • Small teams can outbuild entire departments.

  • LLM-powered workflows remove the need for excessive headcount to scale.

  • Training, automation, and deployment can be embedded with near-zero marginal cost.

The high-friction modernisation of the last decade has been replaced by modular, intent-driven, low-lift innovation — and the gap between adopters and followers is growing… really, really fast.

AI as a Due Diligence Standard, Not a Side Topic

In technical due diligence, we’ve reached a tipping point: AI-readiness isn’t just part of the review — it’s foundational.

Key questions now include:

  • Can this company scale without exploding OPEX?

  • Is the product team fluent in AI-first design and automation?

  • How resilient is the architecture under real-world LLM use?

  • Can the company defend its IP in an AI-assisted competitive field?

We’re not just looking at code or capability anymore — we’re looking at velocity, adaptability, and execution logic. We’re also applying this lens internally: our own skunkworks innovation tracks are AI-native from day zero, because anything else is slower, costlier, and harder to pivot.

The Software Shift: From Code to Cognition

This shift has deep implications for software engineering and IT leadership.

AI is not just a tool for speed — it’s transforming the structure and economics of delivery:

  • System design trumps individual code quality

  • Prompting replaces boilerplate

  • Testing and deployment are increasingly self-managed

  • Toolchains are flattening, generalist builders are accelerating

  • Cost-to-deploy is approaching zero

This isn’t just a change in toolkits — it’s a redefinition of what it means to build.

Vibe Coding and the Cognitive Gap

But here’s where the nuance creeps in — and where strategic leaders need to tread carefully.

The rise of vibe coding — where users describe what they want and AI writes the code — introduces a new kind of fragility. Yes, anyone can now generate software. But most don’t know how it works, what breaks it, or how to fix it. It’s like handing the keys to a supercar to someone who’s never driven a manual.

While this lowers the barrier to entry, it also raises the floor for required system literacy. We’re heading into a world where more people can “drive” the system — but fewer understand how it’s wired underneath.

In the near future, this may be abstracted away entirely — with specialist LLMs handling fault tolerance, debugging, triage, and observability. Developers will become orchestrators, not operators. But for now? It’s a risk. One that must be assessed in any serious technical review or innovation planning cycle.

Blacklight 4D as Strategy: Build What the Business Can’t Yet Buy

We’ve formalised this into what we call Blacklight 4D (find what you cannot yet see) — a short-cycle innovation program built around AI-native tooling, modular architecture, and due diligence-grade engineering and creative disciplines.

It’s designed for companies that:

  • Need to prototype fast without overcommitting headcount

  • Want to validate innovation without legacy drag

  • Are preparing for M&A, internal restructuring, or investor scrutiny

  • Have leadership buy-in, but need execution clarity

It’s not about shiny proofs of concept. It’s about building real capability, fast, with the structural foresight needed to scale or integrate post-sprint (look to our recent hackathon for more).

Who We Help

I partner with decision-makers who see the writing on the wall and want to get ahead of it. Whether you’re preparing for a capital raise, exploring a tech acquisition, building internal capability, or modernising your stack — I help map risk, accelerate opportunity, and engineer with intent.

📩 If you’re navigating AI-readiness, due diligence, or innovation bottlenecks — let’s talk. I’m currently supporting engagements across multiple sectors.


Building Bridges

with Real-Time Data and Unreal Engine

TL;DR – A Hackathon Recap

We set out to prove that Unreal Engine can be more than a rendering tool—it can be a live, integrated node in a real-time digital ecosystem.

In just a few days, we:

  • Prototyped a WebSocket-based sync layer connecting UE with a web UI (Google Maps) and .NET backend using MassTransit + AWS SQS.

  • Used Cesium for UE to build a 3D twin of the real world.

  • Demonstrated two-way communication, not just visualisation—allowing interactions both to and from UE.

  • Skipped auth (for now) to focus on real-time viability and cross-system collaboration.

  • Explored how SpacetimeDB’s timewarp unlocks “experiential analytics”—revisiting moments in time spatially.

We also leaned into a cross-disciplinary team model, where engineering and technical artistry collaborated closely—proof that diverse perspectives create richer solutions.

This wasn’t about shipping production code. It was about momentum toward something bigger – BRAID-like 🤓, if you will.


Foundations are everything…

At Neon Light HQ here in Sydney, we recently ran a focused internal hackathon aimed at solving a deceptively simple but expansive problem: how do you synchronise Unreal Engine (UE) with other platforms in real-time – and in a way that’s extensible, scalable, and meaningful beyond the confines of game development?

The goal was to prototype a WebSocket service that could shuttle data back and forth between UE and external interfaces, making UE not just a rendering endpoint, but a participant in a broader digital ecosystem.

We landed on a three-tiered architecture:

  • A .NET Core backend acting as an intermediary layer, built using MassTransit for message orchestration and AWS SQS for queueing and fan-out.

  • A React web interface that displayed contextual overlays via Google Maps.

  • A Cesium for UE setup rendering a rich 3D digital twin of the real world (this is almost trivial these days – so big thank you to the team over at Cesium).

The intent? If a user clicks something in the web interface, a WebSocket event fires to UE. UE responds with spatial context or 3D metadata. And just as crucially, if UE detects a spatial interaction (e.g. an object selected in-world), that event fans out to web dashboards, logs, and notification systems.
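The two-way loop above can be sketched as a shared event shape plus the fan-out rule for UE-originated events. This is an illustrative sketch, not the actual build — the type and function names (`SyncEvent`, `toUeEvent`, `fanOutTargets`) and the downstream target list are assumptions for the example:

```typescript
// Hypothetical event envelope shared by the web UI and UE over the WebSocket.
type SyncEvent = {
  source: "web" | "ue";
  kind: "click" | "object-selected";
  lat?: number;       // present on map clicks
  lng?: number;
  objectId?: string;  // present on in-world selections
  timestamp: number;  // ms since session start
};

// Wrap a web-map click so UE can resolve it to spatial context.
function toUeEvent(lat: number, lng: number, timestamp: number): SyncEvent {
  return { source: "web", kind: "click", lat, lng, timestamp };
}

// A UE-side interaction fans out to every interested downstream system;
// web-originated events go to UE only, so they return no fan-out targets.
function fanOutTargets(event: SyncEvent): string[] {
  if (event.source !== "ue") return [];
  return ["dashboard", "logs", "notifications"];
}
```

The point of the shared envelope is symmetry: the same message shape travels in both directions, so neither side is a read-only endpoint.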

This may sound modest, but the core principle flips a common pattern on its head. Most integrations (Bentley Systems, for instance) are read-only. Data flows into the visual system but not out. We’re proving that the loop can – and should – close.

It’s not that these systems lack the capability; the desire just hasn’t been there — until now.


Why It Matters

Most of today’s visual-based workloads – spreadsheets, reports, PDFs – exist in ecosystems that sit around spatial engines, not within them. And while tools like UE Datasmith help ingest content into Unreal, they don’t help facilitate collaboration or insight generation from inside the experience – realistically, that’s not what Datasmith or UE was designed to do.

We believe that real value comes when spatial platforms become expressive interfaces—not just canvases.

Think: stakeholder walkthroughs that generate insights, not just impressions. Engineers observing user focus patterns. Designers iterating based on behaviour, not assumptions. Expand this use case through to future governance and the digital twin interface and you’ll see where our team’s collective minds are travelling toward.

This is why one of our next moves is incorporating SpacetimeDB, a time-aware database that unlocks ‘timewarp’ capabilities. Users and systems will be able to query what was happening, who was there, and what was seen at any point in the spatial timeline. It’s experiential analytics without the friction—impressions captured passively, insight drawn actively.
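The shape of that timewarp query can be illustrated with a plain filter over timestamped spatial events. To be clear, SpacetimeDB’s actual API works differently — this sketch only shows the concept, and the `SpatialEvent` fields and `atMoment` function are assumptions for the example:

```typescript
// Illustrative "timewarp" record: who was engaged with what, and when.
type SpatialEvent = {
  actor: string;
  objectId: string;
  enteredAt: number; // ms since session start
  leftAt: number;
};

// Snapshot the spatial timeline: which interactions were live at time t?
function atMoment(events: SpatialEvent[], t: number): SpatialEvent[] {
  return events.filter((e) => e.enteredAt <= t && t < e.leftAt);
}
```

Because the capture is passive (events accumulate as the session runs), the insight step is just a query over the timeline — no replay, no intrusive instrumentation at view time.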

In the video, we see the world coordinate data (Latitude and Longitude) being synchronised from UE through to Google Maps. Towards the end, the ability to “warp” to different locations is captured as well – enabling new interaction paradigms not possible in previous experiential delivery.

On Security, Teamwork, and Realities

In the interest of velocity, we excluded an authentication layer. Not because it’s not important – it is, especially for security and multi-tenant setups – but because the goal of this hackathon wasn’t polish, it was potential. We know auth is a critical next step for any real-world deployment.

Equally critical to the hackathon’s success was our interdisciplinary team. Neon Light’s DNA isn’t just code; it’s artistry, engineering, experience, and storytelling. In this sprint, we saw technical artists collaborate with engineers, ops folks challenge assumptions, and designers stretch the boundaries of what the toolset was originally built for. That shared intent – the idea that collaboration is the actual outcome – was more valuable than any single feature we shipped.

Looking Ahead

As we step back from this experiment, it’s clear we’ve only scratched the surface. The addition of pub/sub queues – using MassTransit via C# and .NET with AWS SQS – enables a fan-out approach to event handling, allowing decoupled services like notifications, AI processes, reporting, and data-lakes to react asynchronously to system activity. This de-centralised approach offers scalable pathways to expand workloads and capabilities without overloading the core systems.
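The fan-out principle can be shown with a tiny in-process stand-in: every subscriber gets its own copy of each published event, decoupled from the publisher. In the real build this role is played by MassTransit over AWS SQS (with a queue per consumer); the `FanOutBus` class here is purely illustrative:

```typescript
type Handler = (event: string) => void;

// Minimal in-process sketch of pub/sub fan-out. Production systems use a
// broker (here, MassTransit + SQS) so consumers are independently durable.
class FanOutBus {
  private handlers: Map<string, Handler[]> = new Map();

  subscribe(topic: string, handler: Handler): void {
    const list = this.handlers.get(topic) ?? [];
    list.push(handler);
    this.handlers.set(topic, list);
  }

  // Deliver the event to every subscriber; returns the consumer count.
  publish(topic: string, event: string): number {
    const list = this.handlers.get(topic) ?? [];
    for (const h of list) h(event); // each consumer reacts independently
    return list.length;
  }
}
```

The design payoff is the one described above: notifications, AI processes, reporting, and data-lake writers can each subscribe without the core system knowing or caring that they exist.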

Equally compelling is the opportunity presented by SpacetimeDB’s timewarp feature. It introduces a novel concept in experiential analytics: the ability to revisit specific moments in a shared 3D environment and extract insights without interrupting or distorting the original user experience. Imagine stakeholders being able to explore what was viewed, when, why, and for how long – without intrusive data capture or forced interactions. It’s a subtle but powerful shift: analytics that respect the flow of experience while enabling deep reflection later.


This prototype, while small in scope, is a key step toward a broader connected ecosystem. While we held off implementing authentication for now – given the short hackathon window – we fully recognise its role in enabling secure and scalable infrastructure for real-world deployment. Similarly, the discussions and cross-domain collaboration that fuelled this build are just as important as the technical outputs. The blending of software engineering and technical artistry created a feedback loop of ideas that shaped not only what we built, but why we built it.

We’re treating this not as a standalone exercise, but as a foundational thread in a broader tapestry – one that will weave into larger platform ambitions. This includes improved interoperability, new collaborative workflows, and real-time digital experiences that extend across disciplines and industries. As we refine these concepts and begin incorporating persistent state layers, we’re opening the door to meaningful partnerships, scalable implementations, and new ways of engaging with the digital world.
