URL Inspection is a Google Search Console tool that returns a detailed, per-URL report on how Google sees a specific page. It is the most authoritative way to find out whether a URL is in Google's index, which version Google has stored, and why a page might not be appearing in Search results.
What URL Inspection reports
For any URL on a verified property, the tool returns:
- Index status: Whether the URL is currently in the Google index
- Last crawl date: When Googlebot most recently fetched the page
- Crawled-as user agent: Which crawler (mobile or desktop) Google used
- Canonical assignment: Both the user-declared canonical and the canonical Google actually selected
- Mobile usability: Whether the page passes Google's mobile-friendly checks
- Structured data validation: Detected schema markup and any errors
- Page resources: CSS, JavaScript, and image fetch results
- Blocking issues: noindex tags, robots.txt blocks, soft 404s, redirect errors, server errors
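Most of the fields above surface in the `indexStatusResult` object of the API response. As a minimal sketch (the sample payload below is illustrative, not a real response), the key fields can be pulled into a flat summary like this:

```python
from dataclasses import dataclass

@dataclass
class InspectionSummary:
    verdict: str           # PASS, FAIL, NEUTRAL, ...
    coverage: str          # human-readable coverage state
    last_crawl: str        # lastCrawlTime, RFC 3339 timestamp
    crawled_as: str        # MOBILE or DESKTOP
    user_canonical: str    # canonical declared on the page
    google_canonical: str  # canonical Google actually selected

def summarize(inspection_result: dict) -> InspectionSummary:
    # indexStatusResult carries the per-URL index report described above
    idx = inspection_result.get("indexStatusResult", {})
    return InspectionSummary(
        verdict=idx.get("verdict", "VERDICT_UNSPECIFIED"),
        coverage=idx.get("coverageState", ""),
        last_crawl=idx.get("lastCrawlTime", ""),
        crawled_as=idx.get("crawledAs", ""),
        user_canonical=idx.get("userCanonical", ""),
        google_canonical=idx.get("googleCanonical", ""),
    )

# Illustrative payload shaped like the API's inspectionResult
sample = {
    "indexStatusResult": {
        "verdict": "PASS",
        "coverageState": "Submitted and indexed",
        "lastCrawlTime": "2024-05-01T08:13:22Z",
        "crawledAs": "MOBILE",
        "userCanonical": "https://example.com/page",
        "googleCanonical": "https://example.com/page",
    }
}
```

Comparing `user_canonical` against `google_canonical` is the quickest way to spot a canonical mismatch, since the two can silently diverge.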
Live test vs index version
URL Inspection offers two views. The default shows the version of the page Google has in its index, which can be days or weeks old. The "Test live URL" button fetches the current version in real time, useful for confirming that a fix is actually deployed before requesting reindexing. Note that a passing live test confirms the page is fetchable and indexable, not that Google will index it.
URL Inspection API
Google also exposes URL Inspection as an API endpoint, allowing automated systems to pull the same per-URL report at scale. The API returns the indexed version only (live tests remain UI-only) and is rate-limited per property, at the time of writing 2,000 inspections per day and 600 per minute, which makes it well suited to monitoring a curated list of high-value pages rather than crawling an entire site.
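Staying under the per-minute quota is straightforward to enforce client-side. A minimal sketch, assuming a 600-calls-per-minute limit (the `sleep` parameter is injectable so the pacing can be tested without waiting):

```python
import time

def throttled(urls, per_minute=600, sleep=time.sleep):
    """Yield URLs no faster than per_minute, to stay under the
    per-property quota of the URL Inspection API."""
    interval = 60.0 / per_minute  # seconds between calls
    for url in urls:
        yield url
        sleep(interval)
```

Each yielded URL would then be passed to a single API inspection call; the daily cap still applies, so the monitored list should stay well under 2,000 URLs per property.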
How VitalSentinel handles this
Manually checking URL Inspection one page at a time does not scale past about a dozen URLs. VitalSentinel's Indexing Monitoring uses Google's URL Inspection API to track coverage on the URLs you actually care about, alerts you the moment a page gets dropped, and surfaces the exact reason (noindex, robots block, canonical change, or soft 404) so you can fix the cause instead of guessing.
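The classification step can be sketched from the API's `indexStatusResult` fields. This is a simplified illustration, not VitalSentinel's actual logic; the enum values (`DISALLOWED`, `BLOCKED_BY_META_TAG`) follow the API's documented response shape:

```python
from typing import Optional

def drop_reason(idx: dict) -> Optional[str]:
    """Map an indexStatusResult-shaped dict to an alert reason.

    Returns None when the page looks healthy."""
    if idx.get("verdict") == "PASS":
        # Still indexed, but flag Google choosing a different canonical
        google = idx.get("googleCanonical")
        if google and google != idx.get("userCanonical"):
            return "canonical change"
        return None
    if idx.get("robotsTxtState") == "DISALLOWED":
        return "robots block"
    if idx.get("indexingState") == "BLOCKED_BY_META_TAG":
        return "noindex"
    if "Soft 404" in idx.get("coverageState", ""):
        return "soft 404"
    return "dropped: " + idx.get("coverageState", "unknown")
```

Checks are ordered so that the most specific, actionable cause wins; the fallback passes through the raw coverage state for anything unrecognized.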
Related Terms
Google Search Console
A free tool from Google that helps website owners monitor, maintain, and troubleshoot their site's presence in Google Search results.
Indexing
The process by which search engines store and organize web content so it can be retrieved and displayed in search results.
Sitemap
A file that lists all the URLs of a website that should be indexed by search engines, helping crawlers discover content.
Web Crawler
An automated program that systematically browses the web to discover and index content for search engines.