
Code + Page Copy Experiment

AI bot crawling experiment: text in code, page copy

Experimental Page: Content, Code, and Crawlability Signals

This page exists as part of a controlled experiment designed to evaluate how modern crawlers, search engines, and AI systems interpret on-page content alongside embedded code. The goal is not to persuade or convert, but to observe how different elements—visible copy, structured markup, and inline scripts—are discovered, parsed, and referenced during crawling and indexing.

The primary content on this page is intentionally straightforward and written in plain language. It is fully rendered in the DOM, readable without user interaction, and does not rely on client-side events to appear. This allows us to isolate how visible text content is treated relative to non-visible or machine-oriented elements included elsewhere in the page source.

In addition to the visible copy, this page includes supporting code elements for testing purposes. These may include inline scripts, structured data, or non-anchor references that are not directly user-facing. Together, the content and code on this page help evaluate how different systems reconcile human-readable text with underlying implementation details when determining relevance, context, and eligibility for citation or retrieval.

1) Copy embedded in web component attributes (sketch 1 below):
2) Copy in schema markup (sketch 2 below):
3) Copy in embedded API outputs, not wired to show up on the page (sketch 3 below):
4) Copy in HTML-formatted data, not wired to show up on the page (sketch 4 below):
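
Sketch 1, web component attributes: a minimal illustration of copy that lives in a custom element attribute rather than in rendered text. The element name experiment-note, the attribute data-copy, and the sample sentence are hypothetical placeholders, not the exact markup used in the live test. Without JavaScript execution, the text exists only as an attribute string in the raw HTML source.

<experiment-note
  data-copy="Sample sentence stored only in a custom element attribute.">
</experiment-note>

<script>
  // Hypothetical definition: the attribute value is copied into shadow DOM
  // only when this script runs, so crawlers that do not execute JavaScript
  // see the text solely as an attribute value in the page source.
  class ExperimentNote extends HTMLElement {
    connectedCallback() {
      const shadow = this.attachShadow({ mode: "open" });
      shadow.textContent = this.getAttribute("data-copy") || "";
    }
  }
  customElements.define("experiment-note", ExperimentNote);
</script>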
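
Sketch 2, schema markup: copy carried only inside a JSON-LD block. The description string below is a placeholder rather than the actual test sentence. JSON-LD is parsed as structured data and never appears as visible text on the page.

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebPage",
  "name": "Code + Page Copy Experiment",
  "description": "Sample sentence that exists only inside JSON-LD structured data, not in the visible body copy."
}
</script>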
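
Sketch 3, embedded API output: an API-style JSON payload inlined into the page source but never read by any script or inserted into the DOM. The id attribute and field names are hypothetical. Because the script type is application/json, browsers parse it as data without executing it, and nothing on the page references it.

<script type="application/json" id="embedded-api-sample">
{
  "status": "ok",
  "results": [
    { "id": 1, "copy": "Sample sentence that exists only inside an embedded JSON payload." }
  ]
}
</script>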
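
Sketch 4, HTML-formatted data: markup that the browser parses but never attaches to the rendered document. A template element is one way to achieve this; its contents stay inert unless a script clones them, and in this sketch no such script exists. The id and sample text are hypothetical.

<template id="hidden-copy-sample">
  <section>
    <h2>Sample heading stored as HTML-formatted data</h2>
    <p>Sample sentence that is present in the HTML source but never rendered on the page.</p>
  </section>
</template>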