The moment the question changed

For years, the question was simple: can people find us on Google?

It was a clean question. Search Engine Optimization was the answer, and the playbook was well understood — keywords, backlinks, technical health, content depth, schema. You could run an audit, fix what broke, and watch the rankings move.

But over the last eighteen months, that question stopped being enough.

The new question — the one I started hearing from clients, from operators in my network, and eventually from my own analytics — was different:

When someone asks an AI system about the problem we solve, are we cited — or invisible?

That question doesn’t have a Google answer. It doesn’t have an SEO answer. And until recently, I didn’t have the kind of diagnostic I wanted: one that could show what AI could access, understand, extract, and cite — and what to fix next.

That’s why I built AIVZ.

SEO still matters. It just doesn’t answer enough.

Let me say this directly, because it gets misquoted: SEO is not going away.

Discoverability still rules. If Google can’t crawl your site, AI systems probably can’t either. If your structured data is broken, neither rankings nor citations will save you. The technical foundation matters more, not less.
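
To make the crawl point concrete: the Access question often starts in robots.txt. The user agents below are the documented crawlers for OpenAI, Anthropic, and Perplexity; whether to allow each one is a policy decision for your site, so treat this as an illustrative sketch rather than a recommendation:

```
# robots.txt: explicitly allow the major AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /
```

A blanket `Disallow: /` for any of these agents takes that system's citations off the table before any content or structure work can matter.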

But there’s a second layer on top of that foundation, and SEO doesn’t measure it.

SEO asks: can this page be discovered and ranked?

AEO asks: can this answer be extracted, trusted, cited, and reused by AI?

Answer Engine Optimization — AEO — is the citation-and-retrieval layer. It sits on top of SEO. It depends on the same technical health that SEO has always rewarded, but it adds requirements that didn’t exist before:

  • Can the answer be extracted cleanly, without misinterpretation?
  • Is the entity behind the page clearly defined and machine-readable?
  • Are the trust signals — authorship, sourcing, freshness — present in a form an AI system can verify?
  • Is the content structured in ways AI systems can lift answers from directly?
  • Does the surrounding authority — endorsements, citations, network position — support being chosen over the next source in line?
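
One machine-readable form those trust signals can take is schema.org markup embedded as JSON-LD. A minimal sketch using standard schema.org `Article` properties (authorship, sourcing, freshness); every name, date, and URL below is a hypothetical placeholder, not a prescription:

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How to Improve AI Visibility",
  "datePublished": "2025-01-15",
  "dateModified": "2025-06-01",
  "author": {
    "@type": "Person",
    "name": "Jane Example",
    "url": "https://example.com/about/jane"
  },
  "publisher": {
    "@type": "Organization",
    "name": "Example Co"
  }
}
```

The `author`, `datePublished`, and `dateModified` properties map directly to the authorship and freshness signals above, in a form a parser can verify without reading the prose.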

When AI systems decide who to cite, they’re not running PageRank. They’re running something closer to a synthesis question: of the sources that could answer this, which one is most likely to be both correct and defensible?

The companies that win the next decade of search will be the ones whose content reliably answers yes to that question. Authority, in this context, is recognized trust — and it has to be visible, verifiable, transferable, and machine-readable.

Attention still gets you seen. Trust determines whether you are selected, cited, recommended, or referred.

I was already working on pieces of this problem.

I didn’t start with AIVZ. It’s actually my third move into this space.

Years before “AEO” became a term anyone argued about, I built Guestify — a platform for systematic podcast guesting. Guestify exists because the fastest way to transfer authority from someone trusted to someone new is an interview. A host vouching for a guest is a recognizable signal — to humans, and increasingly to machines reading transcripts, show notes, and the citation graphs that connect them.

Guestify builds off-site authority through interviews and relationships. That’s one half of the problem.

Then I built AnswerEngineWP — a WordPress-native plugin that scans a single URL against twenty-seven AEO factors, applies the fixes it can apply locally, and flags what needs a deeper pass. AnswerEngineWP handles the on-site half of one execution surface: making WordPress content extractable, structurally clean, and AI-readable.

Both products were built around the same problem from different sides: off-site trust signals on one side, on-site extractability on the other.

But neither one answered the bigger question I kept running into:

Across an entire site, across the AI systems that matter, what’s actually broken — and what’s the right order to fix it?

That’s AIVZ.

What AIVZ actually does.

AIVZ is the AI visibility platform I wish I had when this question first started showing up.

It scans your site against a 93-factor AEO taxonomy and computes a score across three layers: Access (can AI systems crawl your content?), Understanding (can they parse what's there?), and Extractability (can they lift clean answers out of it?). Then it produces a prioritized fix list — what to do first, what depends on what, and where the biggest wins are.
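
To make the three layers concrete, here is a minimal sketch of how a layered score and fix ordering could be computed. The layer names come from this article; the weights, factors, and pass/fail model are my own illustration, not AIVZ's actual 93-factor formula.

```python
from dataclasses import dataclass

@dataclass
class Factor:
    name: str
    layer: str   # "access" | "understanding" | "extractability"
    passed: bool

# Illustrative weights only; the article does not specify how layers are weighted.
LAYER_WEIGHTS = {"access": 0.4, "understanding": 0.3, "extractability": 0.3}

def visibility_score(factors: list[Factor]) -> float:
    """Weighted average of per-layer pass rates, scaled to 0-100."""
    score = 0.0
    for layer, weight in LAYER_WEIGHTS.items():
        in_layer = [f for f in factors if f.layer == layer]
        if in_layer:
            pass_rate = sum(f.passed for f in in_layer) / len(in_layer)
            score += weight * pass_rate
    return round(100 * score, 1)

def prioritized_fixes(factors: list[Factor]) -> list[str]:
    """Order failing factors by layer: access gaps gate everything downstream."""
    order = {"access": 0, "understanding": 1, "extractability": 2}
    failing = [f for f in factors if not f.passed]
    return [f.name for f in sorted(failing, key=lambda f: order[f.layer])]
```

Putting Access failures first reflects the dependency the layers imply: if AI systems cannot crawl a page, fixing its parsing or extraction gaps accomplishes nothing.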

AIVZ helps reveal where visibility differs across AI search and answer systems, because being visible in one environment does not mean you are visible everywhere.

For agencies, AIVZ ships with multi-tenant architecture and white-label reporting. The operators most likely to need this first are the ones already running SEO, content, and PR retainers for clients who are starting to ask their own version of the question I opened with: "why aren't we showing up in AI Overviews?" The agencies that can answer it clearly will have a stronger case for keeping and expanding those retainers.

On WordPress, AIVZ executes through AnswerEngineWP. Other surfaces are coming.

AIVZ is not a replacement for SEMrush or Ahrefs. It runs alongside them. It is not a content generator pretending to be an analytics tool. It is a diagnostic and orchestration layer for the AI visibility work most SEO tools were not built to measure. The 93-factor taxonomy it runs on is the framework. The product is what the framework lets you do with it.

AIVZ does not guarantee citations. No tool can. What it can do is measure the signals AI systems actually read, surface the gaps, and tell you what to fix to improve your odds of being the source they choose.

So I ran AIVZ on AIVZ.

Before asking anyone else to trust the scanner, I ran it on our own site.

Not because the site was perfect — it isn’t. Because that’s the point. AI visibility is not a slogan. It’s something you can measure, diagnose, and improve. A tool that won’t survive its own scan has no business being shipped, and a founder who won’t publish their own gaps has no business asking customers to trust theirs.

Here’s what came back.

Current AI Visibility Score
[INSERT SCORE]

Top strengths

  1. [INSERT STRENGTH]
  2. [INSERT STRENGTH]
  3. [INSERT STRENGTH]

Top gaps

  1. [INSERT GAP]
  2. [INSERT GAP]
  3. [INSERT GAP]

What we’re fixing next

  1. [INSERT FIX]
  2. [INSERT FIX]
  3. [INSERT FIX]

I’m publishing the gaps deliberately. The credibility of a measurement tool comes from being willing to be measured by it. As we ship fixes, the score will move, and we’ll publish updates.

Any serious tool in this category should be willing to measure itself. That is the standard I want AIVZ to meet from day one.

The bet.

Here’s what I think is true.

The next decade of search will not only reward the best content. It will reward the clearest, most trusted, most extractable, most citable expertise. The companies that show up in AI answers will not be the ones who shouted loudest. They’ll be the ones whose authority was recognizable to a machine — visible, verifiable, transferable, and structured for retrieval.

That’s the shift Guestify was built for on the relationship side. That’s the shift AnswerEngineWP was built for on the WordPress side. And that’s what AIVZ measures and fixes across everything else.

If you’re an operator looking at your own analytics and feeling the same thing I started feeling — that the old questions don’t quite cover the new ground — there’s a free way to find out exactly where you stand.

Run a scan on your own site. See what your pages make available to AI. See what's missing.

Then decide what to do about it.

TG

Tony Guarnaccia

Tony Guarnaccia is a marketing operator and founder. He built Guestify and AIVZ and teaches the underlying frame: in AI-mediated discovery, recognized trust is what gets you cited.