Component 02

Accessibility

Onelo

Accessibility is the one component of Onelo’s organic growth engine that requires no remediation. The technical infrastructure is sound, well-maintained, and above category average for a Series B company at this stage. All 16 signals pass. One minor forward-looking issue is flagged: three orphan pages that should be resolved before they cause index bloat. No other action is required at this layer.

Signals rated Healthy are documented concisely — the findings are clean and the evidence is straightforward. Three signals carry minor observations (orphan pages, staging URLs in the sitemap, missing hreflang) and receive fuller treatment because each carries a forward-looking recommendation. The healthy signals are documented for completeness and as a baseline for future diagnostic cycles.

Signals Assessed

Blocking signals: 0

Fragile signals: 0

Healthy signals: 16

This document covers all 16 signals in the Accessibility component. For each signal, you will find: what was assessed and why it matters, the specific findings for Onelo, evidence supporting those findings, and the recommended intervention. 

Signal Assessment

A signal is a subcomponent of one of the ten layers that make up an organic growth engine. Each signal is assessed following our methodology and assigned a status: Healthy, Fragile, Blocking, or Missing. Every assessment is backed by supporting evidence and, where relevant, a recommendation for bringing the signal to Healthy.

Layer Conclusion

Accessibility is fully healthy. All 16 signals pass. The technical foundation is sound and well-maintained — above average for a Series B company operating with a 1.5 FTE organic growth function and an external agency relationship.

The primary contributor to this result is the choice of Webflow as the CMS. Webflow generates server-side rendered HTML by default, eliminates render delta risk, and provides a correctly configured CDN out of the box. The engineering and content teams have maintained the configuration correctly: no crawl traps, no duplicate content, no canonical conflicts, no index bloat, and clean HTTP status codes across all 851 submitted pages.

Three Forward-Looking Housekeeping Items

The following items are not current problems. They are minor observations that, if not addressed, could create problems over the next 12–18 months as the content programme scales and the UK and Canadian expansion content is built.

L02 — Three Forward-Looking Housekeeping Items
Item Current status Risk if unaddressed Action required Timeline
Three orphan pages indexed with no internal links Indexed but unreachable via internal navigation Slow accumulation of orphan pages creates index quality dilution over time Add one internal link to each from the most relevant existing page Week 1 — 15 minutes
Four staging page URLs in sitemap Submitted in error, still present in sitemap Minor crawl budget waste until removal and GSC recrawl complete Remove from sitemap, request GSC recrawl Week 1 — 15 minutes
No hreflang tags on any page Correct for a single-market site today When UK and Canadian content is built, hreflang must be added from day one to avoid indexation confusion Implement hreflang tags when first locale-specific pages are published At time of international content build

01. Robots.txt Directives

Healthy

What this signal assesses

The robots.txt file tells crawlers which parts of the site they are and are not permitted to access. Misconfigured robots.txt rules are one of the most common causes of silent indexation failures — pages that appear to be published but are invisible to search engines because a crawl rule is blocking them.

Findings

Onelo’s robots.txt is correctly configured. Admin, staging environment, and URL parameter-based paths are correctly excluded. All commercial, content, and navigational pages are crawlable. No commercial pages are blocked. The file follows current best practice with a sitemap reference at the bottom pointing to the sitemap index file.

L02 Signal 01 — Evidence #1
Directive type Configuration Commercial pages blocked? Assessment
Admin / backend paths Disallowed No Correct
Staging environment paths Disallowed No Correct
URL parameter paths Disallowed No Correct
Commercial pages (/product/, /solutions/, /pricing) Allowed No Correct
Blog content (/blog/) Allowed No Correct
Sitemap reference Present — points to sitemap index N/A Best practice

Robots.txt configuration audit
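For reference, a robots.txt matching the audited pattern would look roughly like the sketch below. The specific directive paths are illustrative assumptions, not a copy of Onelo's live file:

```
# Illustrative robots.txt; paths are assumptions rather than Onelo's live file
User-agent: *
Disallow: /admin/
Disallow: /staging/
Disallow: /*?utm_

Sitemap: https://www.onelo.com/sitemap.xml
```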

RECOMMENDATION

No action required. Monitor robots.txt at each CMS update or site architecture change to confirm no unintended directives are introduced.

02. Crawl Trap Detection

Healthy

What this signal assesses

Crawl traps are site structures that cause search engine bots to crawl an infinite or extremely large number of URLs — typically generated by faceted navigation, session parameters, infinite scroll, or calendar-based URL generation. They waste crawl budget and can prevent important pages from being crawled regularly.

Findings

No crawl traps detected. Onelo does not use infinite scroll on any indexed page. Faceted navigation exists on the integrations directory page but is implemented using JavaScript rendering that correctly suppresses parameter-based URL generation. The blog archive uses standard pagination with rel=prev/next markup.

L02 Signal 02 — Evidence #1
Site feature Trap risk Implementation Assessment
Integrations directory (faceted) High if misconfigured JavaScript rendering suppresses URL parameter generation No trap
Blog pagination Medium if misconfigured Standard rel=prev/next markup, no parameter explosion No trap
Infinite scroll High if present Not used on any indexed page No trap
Calendar / date-based URLs Medium if present Not present on site No trap
Session parameters High if misconfigured Excluded via robots.txt and canonical tags No trap

Crawl trap risk assessment — key site features

RECOMMENDATION

No action required. If future feature development introduces faceted filtering, calendar views, or infinite scroll to any indexed page, verify crawl trap prevention before releasing to production.

03. URL Parameter Handling

Healthy

What this signal assesses

URL parameters (e.g. ?utm_source=, ?ref=, ?page=) can generate thousands of duplicate URLs that consume crawl budget and dilute the authority of the canonical page. This signal assesses whether parameter-generated URLs are correctly handled through robots.txt exclusion, canonical tags, or both.

Findings

URL parameters are correctly managed. Tracking parameters (UTM variants) are excluded from crawling via robots.txt and are also covered by canonical tags pointing to the clean URL. Pagination parameters on the blog archive use correctly implemented rel=prev/next markup. No duplicate content issues arising from parameter variants were detected in the index.
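The control pattern described here, tracking parameters stripped back to a clean canonical URL while functional parameters survive, can be sketched in a few lines of Python. The parameter names are the common tracking set and are assumptions for illustration, not Onelo's actual configuration:

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

# Tracking parameters that should never create indexable URL variants
# (illustrative set, not Onelo's actual configuration).
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign",
                   "utm_term", "utm_content", "ref", "session"}

def canonical_url(url: str) -> str:
    """Strip tracking parameters, keeping functional ones such as pagination."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

print(canonical_url("https://onelo.com/blog/post?utm_source=newsletter&page=2"))
# → https://onelo.com/blog/post?page=2
```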

L02 Signal 03 — Evidence #1
Parameter type Example Control mechanism Duplicate content risk Assessment
Tracking (UTM) ?utm_source=newsletter robots.txt exclusion + canonical to clean URL None Correct
Referral ?ref=partner robots.txt exclusion + canonical None Correct
Pagination ?page=2 rel=prev/next markup on blog archive Managed Correct
Session IDs ?session=... Excluded via robots.txt None Correct

URL parameter handling audit — parameter types and controls

RECOMMENDATION

No action required. When new tracking parameters or filtering mechanisms are added to the site, ensure they are covered by either robots.txt or canonical tags before deployment.

04. Core Page HTTP Status Codes

Healthy

What this signal assesses

HTTP status codes determine whether a page can be indexed. A page returning a 4xx error cannot be indexed. A redirect chain longer than one hop loses link equity at each hop. This signal assesses whether all commercial and content pages return clean 200 responses and whether redirect architecture is efficient.

Findings

All 851 submitted URLs return HTTP 200. There are no 4xx errors on commercial or content pages. The redirect audit identified 14 redirects across the site, all of which are single-hop 301s with no chains. Every redirect destination is a correctly canonical page. No redirect loops were detected.

L02 Signal 04 — Evidence #1
Status code Count Page types affected Assessment
200 OK 851 All commercial, content, and navigational pages Correct
301 Permanent Redirect 14 Legacy URL variants — all single-hop to canonical destinations Correct — no chains
4xx Client Error 0 None None detected
5xx Server Error 0 None None detected
Redirect chains (2+ hops) 0 None None detected
Redirect loops 0 None None detected
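The single-hop check described in these findings can be reproduced against any crawl export. A minimal sketch, assuming the redirects are available as a source-to-destination mapping:

```python
def hop_count(url: str, redirects: dict[str, str]) -> int:
    """Count redirect hops from url to its final destination; detect loops."""
    hops, seen = 0, set()
    while url in redirects:
        if url in seen:
            raise ValueError(f"redirect loop at {url}")
        seen.add(url)
        url = redirects[url]
        hops += 1
    return hops

# Illustrative legacy redirect; a healthy architecture never exceeds one hop.
redirect_map = {"/old-pricing": "/pricing"}
print(hop_count("/old-pricing", redirect_map))  # → 1
```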

RECOMMENDATION

No action required. Monitor status codes after any CMS migration, URL restructuring, or platform change.

05. Indexable Page Ratio

Healthy

What this signal assesses

The indexable page ratio measures what proportion of submitted URLs are successfully indexed by Google. A ratio below 90% typically indicates a structural issue — misconfigured canonicals, noindex tags, or crawl budget constraints preventing full indexation. For most sites, a ratio above 95% is healthy.

Findings

847 of 851 submitted URLs are indexed — a 99.5% indexable page ratio. The 4 unindexed URLs are staging environment pages that were submitted in error through a sitemap configuration mistake. No content pages, commercial pages, or blog posts are missing from the index.

L02 Signal 05 — Evidence #1
Category Submitted Indexed Ratio Assessment
Commercial pages 15 15 100% All indexed
Blog posts 94 94 100% All indexed
Other pages (about, contact, etc.) 742 738 99.5% 4 staging pages excluded
TOTAL 851 847 99.5% Healthy — above 95% threshold

Indexable page ratio — submitted vs indexed

The 4 unindexed staging pages are the only gap in an otherwise complete index. The sitemap configuration error that caused them to be submitted has been identified; the fix is to remove the four staging page URLs from the sitemap and submit a recrawl request in GSC.

[Link to spreadsheet: GSC — Coverage report — Indexed tab — export all indexed URLs — cross-reference with submitted sitemap URLs to identify any gaps — separately export the ‘Excluded’ tab to confirm the 4 unindexed URLs are staging pages only]
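The cross-reference described in the spreadsheet note can also be scripted. A sketch using an illustrative sitemap fragment (the URLs and file contents are assumptions, not Onelo's actual sitemap):

```python
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text: str) -> set[str]:
    """Extract <loc> values from a sitemap document."""
    root = ET.fromstring(xml_text)
    return {loc.text.strip() for loc in root.findall(".//sm:loc", NS)}

# Illustrative fragment, not Onelo's actual sitemap.
sitemap = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://onelo.com/pricing</loc></url>
  <url><loc>https://staging.onelo.com/draft</loc></url>
</urlset>"""

indexed = {"https://onelo.com/pricing"}  # from the GSC 'Indexed' export
print(sorted(sitemap_urls(sitemap) - indexed))  # unindexed URLs to investigate
```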

RECOMMENDATION

Remove the 4 staging page URLs from the sitemap and request a GSC coverage recrawl. This is a 15-minute task. After removal, confirm the Excluded count in the Coverage report drops by 4 within the next crawl cycle (typically 2–4 weeks).

06. Canonical Tag Correctness

Healthy

What this signal assesses

Canonical tags tell search engines which version of a URL is the definitive one. Incorrect canonical implementation is one of the most common causes of duplicate content issues, split link equity, and indexation confusion. A canonical that points to the wrong URL, or a conflicting canonical across page variants, can silently undermine rankings.

Findings

Canonical tags are correctly implemented across all page types. Self-referencing canonicals are present on all unique pages. Paginated content correctly uses canonical tags pointing to the first page of the series rather than each individual page. No conflicting or missing canonical tags were detected in the crawl.

L02 Signal 06 — Evidence #1
Page type Canonical implementation Conflicts detected? Assessment
Homepage Self-referencing canonical to onelo.com/ No Correct
Product pages Self-referencing canonical to page URL No Correct
Solution pages Self-referencing canonical to page URL No Correct
Blog posts Self-referencing canonical to post URL No Correct
Blog pagination (page 2, 3...) Canonical points to page 1 of series No Correct
Tag / category archives noindex — no canonical required No Correct

Canonical tag audit — key page types
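For reference, the self-referencing and pagination patterns described above look like this in the page head (URLs are illustrative):

```
<!-- On https://onelo.com/product/workflow-builder: self-referencing canonical -->
<link rel="canonical" href="https://onelo.com/product/workflow-builder" />

<!-- On page 2 of the blog archive: canonical points to page 1 of the series -->
<link rel="canonical" href="https://onelo.com/blog" />
```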

RECOMMENDATION

No action required. Verify canonical implementation whenever new page templates are deployed or the site architecture is restructured.

07. Duplicate Content Clusters

Healthy

What this signal assesses

Duplicate content occurs when the same or substantially similar content exists at multiple URLs. It dilutes link equity, creates indexation uncertainty, and in significant cases can trigger quality penalties. This signal looks for both exact and near-duplicate content clusters that would confuse search engine indexation decisions.

Findings

No significant duplicate content clusters were detected. Blog category and tag archive pages — a common source of near-duplicate content — are correctly configured with noindex tags. The blog pagination does not generate duplicate page-level content because each paginated page has a meaningfully different set of posts. No near-duplicate page pairs were identified in the content crawl.

L02 Signal 07 — Evidence #1
Duplicate source Present? Control mechanism Duplicate content detected?
Blog category archive pages Yes noindex tags on all archive pages No
Blog tag archive pages Yes noindex tags on all tag pages No
Paginated blog pages Yes Canonical + rel=prev/next — distinct content per page No
URL parameter variants Yes (UTM etc.) robots.txt + canonical — Signal 03 No
Near-duplicate page pairs Not detected N/A No

Duplicate content risk assessment — common sources

RECOMMENDATION

No action required. If blog content volume increases significantly or new content types are introduced, re-audit archive page configuration.

08. Soft 404 Detection

Healthy

What this signal assesses

A soft 404 is a page that returns a 200 HTTP status code but displays a ‘page not found’ or empty content message to the user and to Googlebot. Search engines waste crawl budget on soft 404s and may eventually deindex them. They are particularly common on e-commerce and SaaS sites with dynamic URL generation.

Findings

No soft 404s detected. All pages returning 200 contain substantive content. Error handling on the site correctly returns 404 status codes for non-existent URLs rather than displaying a 200-status error page.

L02 Signal 08 — Evidence #1
Test case URL tested Response Content returned Assessment
Non-existent product URL /product/does-not-exist 404 status code Standard 404 error page Correct
Non-existent blog post /blog/does-not-exist 404 status code Standard 404 error page Correct
Deleted page (redirected) Legacy deleted URL 301 to canonical No 200 returned Correct
GSC soft 404 report Checked via Coverage report 0 soft 404s reported All 200s contain content Correct

Soft 404 check — key test cases
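A soft-404 spot check can be automated with a simple heuristic: flag any 200 response whose body is near-empty or reads like an error page. The markers and length threshold below are illustrative assumptions, not audit parameters:

```python
def is_soft_404(status_code: int, body_text: str) -> bool:
    """Flag a 200 response whose body looks like an error page or is near-empty."""
    error_markers = ("page not found", "nothing here", "does not exist")
    text = body_text.strip().lower()
    return status_code == 200 and (len(text) < 150 or any(m in text for m in error_markers))

print(is_soft_404(200, "Sorry, page not found."))  # → True  (soft 404)
print(is_soft_404(404, "Sorry, page not found."))  # → False (correct hard 404)
```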

RECOMMENDATION

No action required. Verify soft 404 handling when new CMS templates, landing page builders, or dynamic URL patterns are introduced.

09. Index Bloat Assessment

Healthy

What this signal assesses

Index bloat occurs when a site has significantly more pages indexed than it has meaningful content — typically caused by auto-generated thin pages, tag archives, author pages, or parameter variants. Bloated indexes dilute site quality signals and reduce crawl budget efficiency.

Findings

847 indexed pages across a site of Onelo’s size and content volume is appropriate. There is no evidence of index bloat. Thin pages such as tag archives, empty category pages, and author pages are correctly noindexed or excluded from the sitemap. The indexed page count reflects genuine, substantive content.

L02 Signal 09 — Evidence #1
Page category Indexed count Substantive content? Assessment
Commercial pages (product, solution, pricing) 15 Yes Correct
Blog posts 94 Yes Correct
Navigational pages (homepage, about, contact) ~12 Yes Correct
Supporting pages (integrations, legal, etc.) ~726 Yes Correct
Tag / category archives 0 noindexed — excluded Correct
Author pages 0 noindexed — excluded Correct
Parameter variants 0 Blocked — canonical / robots.txt Correct

Indexed page composition — 847 total indexed pages

Note: there are three orphan pages currently indexed that have no internal links pointing to them. These are not yet causing bloat, but orphan pages accumulate over time and should be resolved proactively. See the Layer Conclusion for the housekeeping recommendation.

RECOMMENDATION

No current action required. The three orphan pages are flagged as a forward-looking housekeeping item in the Layer Conclusion.

10. JavaScript Render Delta

Healthy

What this signal assesses

Many modern sites rely on JavaScript to render content — meaning the HTML delivered to a browser is empty until JavaScript executes and populates the page. Search engine crawlers can render JavaScript, but there is often a gap between what a user sees and what a crawler sees if rendering is incomplete or delayed. This signal compares the rendered version of key pages against the raw HTML to identify any content that is invisible to crawlers.

Findings

JavaScript rendering was tested on the homepage, three product pages, two solution pages, and three blog posts using Googlebot’s rendered output via Google Search Console’s URL Inspection tool. In all nine cases, the rendered content matched the user-visible content with no meaningful delta. All commercial page copy, headings, and CTAs are present in the rendered output. The site is built on Webflow, which generates server-side rendered HTML by default — this is the primary reason the rendering is clean.

L02 Signal 10 — Evidence #1
Page tested Page type Content in raw HTML? Content in rendered output? Render delta?
/home Homepage Yes Yes None
/product/onboarding-automation Product Yes Yes None
/product/workflow-builder Product Yes Yes None
/product/integrations Product Yes Yes None
/solutions/mid-market-onboarding Solution Yes Yes None
/solutions/remote-teams Solution Yes Yes None
/blog/employee-onboarding-checklist Blog post Yes Yes None
/blog/remote-onboarding-best-practices Blog post Yes Yes None
/blog/onboarding-best-practices-2024 Blog post Yes Yes None

JavaScript render delta test — 9 pages tested

Each page tested via GSC URL Inspection — rendered HTML compared against user-visible content. A meaningful delta would appear as missing headings, missing body copy, or missing CTAs in the rendered output.

Webflow’s server-side rendering ensures that all content is present in the raw HTML before JavaScript executes. This is structurally advantageous: there is no rendering dependency that could create a delta if JavaScript execution is delayed or incomplete.
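A minimal version of this test can be scripted: extract the headings present in the raw HTML and compare them against what the rendered page shows. The page content below is an illustrative stand-in, not an actual Onelo page:

```python
from html.parser import HTMLParser

class HeadingExtractor(HTMLParser):
    """Collect h1/h2 text from raw (pre-JavaScript) HTML."""
    def __init__(self):
        super().__init__()
        self.headings = []
        self._in_heading = False

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2"):
            self._in_heading = True

    def handle_endtag(self, tag):
        if tag in ("h1", "h2"):
            self._in_heading = False

    def handle_data(self, data):
        if self._in_heading and data.strip():
            self.headings.append(data.strip())

# Illustrative raw HTML; with server-side rendering the headings are already present.
raw_html = "<html><body><h1>Onboarding Automation</h1><p>Page copy...</p></body></html>"
parser = HeadingExtractor()
parser.feed(raw_html)
print(parser.headings)  # → ['Onboarding Automation']
```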

RECOMMENDATION

No action required. If the site migrates away from Webflow or introduces client-side rendered components (React, Vue, etc.) for commercial page content, retest render delta as a priority.

11. Server Latency and Timeout Patterns

Healthy

What this signal assesses

Server response time affects both user experience and crawl efficiency. Slow server responses mean Googlebot receives fewer pages per crawl session, reducing crawl frequency over time. Timeout patterns — where the server fails to respond within the crawler’s patience window — can cause pages to be dropped from the crawl queue entirely.

Findings

Average server response time: 180ms. This is well within the threshold where crawl efficiency begins to be affected (typically above 500ms). No timeout patterns detected in the crawl log analysis. Server performance is consistent across the US-East, US-West, and EU locations tested. Webflow’s CDN infrastructure is performing correctly.

L02 Signal 11 — Evidence #1
Test location Average TTFB Threshold (crawl impact) Assessment
US-East 162ms >500ms Well within threshold
US-West 181ms >500ms Well within threshold
EU 198ms >500ms Well within threshold
Average across all locations 180ms >500ms Healthy

Server latency measurements — three geographic test points

TTFB: Time to First Byte, measured from request initiation. The 180ms average is 64% below the threshold at which crawl efficiency begins to degrade.
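The headroom figure follows directly from the measurements above:

```python
avg_ttfb_ms = 180    # average across US-East, US-West, and EU
threshold_ms = 500   # point at which crawl efficiency begins to degrade
headroom = (threshold_ms - avg_ttfb_ms) / threshold_ms
print(f"{headroom:.0%} below the crawl-impact threshold")  # → 64% below the crawl-impact threshold
```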

RECOMMENDATION

No action required. Monitor server latency quarterly. If the site’s infrastructure changes (CMS migration, hosting change, significant traffic increase) retest latency from all three locations before and after the change.

12. CDN and Cache Behaviour

Healthy

What this signal assesses

CDN configuration and cache header correctness determine how efficiently content is delivered to users and crawlers globally. Misconfigured cache headers can cause stale content to be served to crawlers, delaying the pickup of new pages and updated content. This signal checks that the CDN is functioning correctly and that cache headers are appropriate for each content type.

Findings

CDN is correctly configured via Webflow’s built-in CDN infrastructure. Cache headers on HTML pages are set to cache-control: public, max-age=0, must-revalidate — appropriate for dynamic content that may be updated at any time. Static assets (images, CSS, JS) use appropriate long-duration cache headers. No stale content issues were detected.

L02 Signal 12 — Evidence #1
Content type Cache-Control header Appropriate? Assessment
HTML pages (all page types) public, max-age=0, must-revalidate Yes — revalidated on every request Correct
Images public, max-age=31536000, immutable Yes — long-duration appropriate for static assets Correct
CSS files public, max-age=31536000, immutable Yes Correct
JavaScript files public, max-age=31536000, immutable Yes Correct

Cache header configuration — content types
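For reference, the audited headers as they would appear on responses (the asset path is an illustrative assumption):

```
GET /pricing                HTTP 200
  cache-control: public, max-age=0, must-revalidate

GET /assets/logo.svg        HTTP 200
  cache-control: public, max-age=31536000, immutable
```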

RECOMMENDATION

No action required. The must-revalidate directive on HTML pages ensures Googlebot always receives the current version of any updated page on the next crawl visit.

13. Geo-Delivery Consistency

Healthy

What this signal assesses

Some sites behave differently depending on the geographic location of the visitor or crawler — serving different content, imposing geo-restrictions, or redirecting to locale-specific pages. If Googlebot’s US crawlers see different content than other crawlers, or if the site geo-blocks certain regions, indexation gaps can emerge that are invisible from within the primary market.

Findings

Content delivery is consistent across the US, UK, and EU test locations used in this diagnostic. No geo-restrictions, geo-redirects, or content variations based on IP origin were detected. All pages return the same content regardless of the request origin geography. This is relevant to note because the Series B investor commitments to UK and Canadian expansion mean Onelo will need to build geo-specific content in future — and the current infrastructure confirms there is no technical barrier to doing so.

L02 Signal 13 — Evidence #1
Test location Content returned Redirected? Geo-blocked? Assessment
US (primary market) Identical content across all pages No No Consistent
United Kingdom Identical content across all pages No No Consistent
European Union Identical content across all pages No No Consistent

Geo-delivery consistency test — US, UK, and EU

Forward-looking note: when UK and Canadian market content is built (Component 10 — Expansion), hreflang tags will need to be implemented to signal locale-specific content to search engines. The current geo-delivery infrastructure confirms this will be a configuration addition, not a structural rebuild.

RECOMMENDATION

No current action required. When building international content for the UK and Canadian expansion, implement hreflang tags from the first publication of locale-specific pages — do not add them retroactively after a large content build.
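When that build happens, hreflang is typically implemented as a reciprocal set of annotations on every locale variant, each listing the full set including itself. The URLs below are illustrative assumptions about the eventual structure:

```
<!-- Placed on every locale variant of the page; each variant lists the full set -->
<link rel="alternate" hreflang="en-us" href="https://onelo.com/pricing" />
<link rel="alternate" hreflang="en-gb" href="https://onelo.com/uk/pricing" />
<link rel="alternate" hreflang="en-ca" href="https://onelo.com/ca/pricing" />
<link rel="alternate" hreflang="x-default" href="https://onelo.com/pricing" />
```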

14. Sitemap Health

Healthy

What this signal assesses

Sitemaps communicate to search engines which pages exist on the site and when they were last updated. A sitemap with errors, missing pages, or outdated lastmod dates reduces the efficiency of crawl prioritisation. This signal assesses whether Onelo’s sitemaps are accurate, complete, and correctly structured.

Findings

Three sitemaps are submitted to Google Search Console via a sitemap index file: a main pages sitemap, a blog sitemap, and an images sitemap. All submitted URLs are indexable and return 200 status codes. The sitemap index is correctly formatted. Lastmod dates are present and accurate. The only issue — the four staging page URLs — has been identified and the correction is recommended in Signal 05.

L02 Signal 14 — Evidence #1
Sitemap URL count Errors Lastmod dates Assessment
Main pages sitemap ~757 URLs 0 Present and accurate Healthy
Blog sitemap 94 URLs 0 Present and accurate Healthy
Images sitemap Variable 0 Present Healthy
Staging page URLs (to remove) 4 Included in sitemap in error N/A Remove — see Signal 05
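The sitemap index structure described above takes this shape (file names are illustrative assumptions):

```
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://onelo.com/sitemap-pages.xml</loc></sitemap>
  <sitemap><loc>https://onelo.com/sitemap-blog.xml</loc></sitemap>
  <sitemap><loc>https://onelo.com/sitemap-images.xml</loc></sitemap>
</sitemapindex>
```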

RECOMMENDATION

Remove the 4 staging page URLs from the main pages sitemap (actioned in Signal 05). All other sitemap configuration is correct.

15. Index Stability Over Time

Healthy

What this signal assesses

A stable index is a sign that the site is being crawled consistently and that no algorithmic or manual actions are causing pages to drop. Unexplained drops in indexed page count often precede ranking declines and are an early warning signal for accessibility problems, quality issues, or algorithm sensitivity.

Findings

GSC data over the past 12 months shows a stable and gently growing indexed page count, consistent with the publication rate of new blog content. No unexplained drops or spikes. No manual actions. No algorithmically driven deindexation events are detectable in the data. The indexed page count has grown from 783 to 847 over the 12-month period — an 8.2% increase, consistent with a cadence of roughly 6–8 new blog posts per quarter plus supporting pages.


L02 Signal 15 — Evidence #1
Period Indexed pages Change Explanation Assessment
Mar 2024 783 Baseline Stable
Jun 2024 802 +19 ~6–8 new blog posts published Expected growth
Sep 2024 819 +17 ~6–8 new blog posts published Expected growth
Dec 2024 834 +15 ~6–8 new blog posts published Expected growth
Mar 2025 847 +13 ~6–8 new blog posts published Expected growth
12-month change +64 pages (+8.2%) Consistent with ~6–8 posts/quarter cadence Healthy — no anomalies

Index stability — 12-month indexed page count trend

GSC Coverage report data over the past 12 months. A healthy trend shows gradual growth consistent with content publication, with no unexplained drops.

[Link to spreadsheet: GSC — Performance — Search results — compare date range month-over-month for total clicks, impressions, and indexed page count — export monthly data and plot trend to confirm no unexplained drops]
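The monitoring rule used for this signal, investigate any single-period fall of more than 10 pages, can be sketched as:

```python
def flag_drops(counts: list[int], threshold: int = 10) -> list[int]:
    """Return indices where the indexed page count fell by more than threshold."""
    return [i for i in range(1, len(counts))
            if counts[i - 1] - counts[i] > threshold]

# Quarterly indexed-page counts from the table above (Mar 2024 - Mar 2025).
counts = [783, 802, 819, 834, 847]
print(flag_drops(counts))  # → []  (nothing to investigate)
```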

RECOMMENDATION

No action required. Continue monitoring the GSC Coverage report monthly. Any drop of more than 10 pages in a single month without a corresponding content deletion warrants investigation.

16. Organic Ranking Existence

Healthy

What this signal assesses

This is the baseline confirmation signal for the Accessibility component. The presence of organic rankings across a meaningful number of queries confirms that search engines can access, index, and rank the site’s pages. It is a composite validation of all preceding signals — a site with ranking presence has no fundamental accessibility failure.

Findings

Onelo ranks for 1,847 queries across 847 indexed pages. Ranking existence is confirmed. Accessibility is not a constraint on organic performance.

However, this signal is the natural bridge to Component 03 — Category Presence, where the composition of those 1,847 rankings is examined. The fact that rankings exist does not mean the right rankings exist. The accessibility layer confirms that the engine can rank. The Category Presence layer reveals what it is — and is not — ranking for.

L02 Signal 16 — Evidence #1
Metric Value Benchmark (Series B SaaS) Assessment
Total ranking queries 1,847 500+ Confirmed ranking presence
Indexed pages 847 N/A Appropriate for site size
Ratio of ranking queries to indexed pages 2.18 queries per page Above 1.0 Healthy ratio
Manual actions active None None No restrictions
Accessibility barrier to ranking None identified None All 16 signals pass

Organic ranking existence — baseline confirmation

The 1,847 rankings confirm that the accessibility infrastructure is functioning correctly. The diagnostic finding for this component is complete: no accessibility constraint exists. Component 03 — Category Presence will examine whether these rankings are the right rankings for commercial growth.

[Link to spreadsheet: Ahrefs Site Explorer — Organic Keywords report — export all ranking queries — confirm total count — the composition of these rankings (branded vs non-branded, intent distribution) is analysed in Components 03 and 04]

RECOMMENDATION

No action required. Ranking existence will be tracked as part of the standard monthly KPI review. Any significant decline in total indexed pages or total ranking queries warrants an Accessibility re-assessment.