Technical SEO Audit Checklist: What We Examine on Every Client Site

Written by Gabriel Bertolo
March 3, 2026

96% of web pages get ZERO organic search traffic from Google. 

The difference between that 96% and the 4% that do get traffic usually comes down to technical SEO. 

Technical SEO refers to optimizing a website’s infrastructure to improve how search engines crawl, render, index, and rank pages. It focuses on machine accessibility rather than content quality or backlinks.

Technical SEO isn’t about pleasing Google’s crawler bots; it’s about making sure that there are no technical issues preventing search engines from accessing your website. 

For example, when your website doesn’t pass Core Web Vitals, you don’t just rank lower; you can lose up to 60% of visitors who won’t wait for the page to load. 

When crawlers can’t index your content, your best pages can be invisible to Google. 

Unfortunately, most businesses don’t know their website has technical issues until search traffic plummets. By the time you realize there’s a problem, you could have lost months of potential revenue. A comprehensive Technical SEO audit can catch these issues before they affect your revenue. 

At Radiant Elephant, we take technical SEO audits seriously. Our audits go deep; we don’t just run a Screaming Frog report and call it a day, because there are often issues that even the best automated tools don’t catch. 

We examine over 100 technical factors, and every issue is rated by priority, severity, and potential revenue impact.  

This article breaks down what a professional technical SEO audit covers, why each element matters for rankings and UX, the tools and processes we use to identify issues, the most common technical SEO problems we see, and how technical SEO affects your business. 

By the end of the article, you’ll understand why technical SEO is such an important foundation. 

When I audit a new client’s website, I often find 30-50 technical issues within the first hour. And that’s typical, whether the site was built by a freelancer or a huge national agency. 

 

Category 1: Core Web Vitals & Page Speed – The User Experience Foundation

Only 54.6% of websites pass all three Core Web Vitals assessments. Source: SE Ranking

And the number gets worse on mobile. Mobile devices account for over 60% of site visits, yet only 43.4% of websites pass Core Web Vitals on mobile. 

 

Largest Contentful Paint (LCP) – Loading Speed

What It Measures: The time until the largest visible element renders, often the main hero image or heading. Target: under 2.5 seconds. 

Why it matters: Users judge site speed by how long it takes for them to see the primary content, not when the page is fully loaded. A slow LCP leads to high bounce rates. People don’t have the patience to wait. If your hero isn’t loading fast, they will leave your site. 

Common issues affecting the LCP score:

  • Unoptimized images
  • Not implementing lazy loading images
  • Render-blocking JavaScript and CSS
  • Slow server response time due to cheap hosting
  • No Content Delivery Network (CDN) like Cloudflare

Business Impact: Sites that take over 2 seconds to load lose 60% of website visitors. If your site gets 10k monthly visitors, a slow LCP means you could be losing 6k potential customers every month, before they even see your messaging and value proposition. 

Tools we use to test LCP: Chrome DevTools, Google PageSpeed Insights, GTmetrix, and testing on real devices and connections. 

 

Interaction to Next Paint (INP) – Responsiveness

Interaction to Next Paint (INP) measures how quickly a webpage responds to user interactions. Google defines INP as the latency between a user action, such as a click, tap, or keypress, and the next visual update on the screen. 

What It Measures: How quickly the site responds to user interactions (clicks, taps, keyboard input). Target: under 200 milliseconds. 

Why It Matters: Unresponsive sites feel broken. Users click buttons that don’t respond, forms that lag, and menus that freeze. This results in user frustration and site abandonment.

Common Issues affecting INP:

  • Heavy JavaScript execution blocking main thread
  • Third-party scripts (chat widgets, analytics) that cause delays
  • Unoptimized event handlers
  • Long tasks that block interactions

Business Impact: E-commerce websites with poor INP scores lose sales at checkout, and for service businesses, lagging forms lose potential leads. Every millisecond matters. 

 

Cumulative Layout Shift (CLS) – Visual Stability

Cumulative Layout Shift, or CLS, measures the visual stability of a webpage during loading. Google defines CLS as the total of unexpected layout shifts that occur when visible elements change position between rendered frames.

What It Measures: How much the content shifts around while loading. Target: under 0.1.

Why It Matters: Have you ever been on a website, and you go to click a button, and then the page shifts, and instead you click an ad? This is poor CLS in action. A high CLS means poor UX. 

Common Issues affecting CLS:

  • Images without width/height attributes
  • Ads, embeds, and iframes without reserved space
  • Web fonts that cause the text to shift 
  • Dynamically injected content that pushes existing content down

Business Impact: A bad CLS score drags down your conversion rate. If a user has to chase a button or a form field around the page, trust evaporates. 

68% of WordPress sites fail CLS. The most common culprits are poorly implemented themes and plugins.
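Google publishes fixed thresholds for all three metrics: LCP at 2.5s/4.0s, INP at 200ms/500ms, and CLS at 0.1/0.25 for the good/needs-improvement/poor bands. Here is a minimal, illustrative Python sketch of how field data can be bucketed against them:

```python
# Core Web Vitals thresholds: (good, needs-improvement) cut-offs.
# Values at or below "good" pass; values above the second number are "poor".
THRESHOLDS = {
    "LCP": (2.5, 4.0),    # seconds
    "INP": (200, 500),    # milliseconds
    "CLS": (0.1, 0.25),   # unitless layout-shift score
}

def rate(metric: str, value: float) -> str:
    """Classify a Core Web Vitals measurement into Google's three bands."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"

print(rate("LCP", 1.9))   # a fast hero render passes
print(rate("INP", 350))
print(rate("CLS", 0.31))
```

Pages sitting in the “needs improvement” band are often the fastest wins in an audit.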

 

Category 2: Crawlability & Indexability – Can Google Actually See Your Site?

Your site could be amazing, have the best content, the strongest value proposition, powerful messaging, and bulletproof on-page SEO, but if search engines can’t crawl and index your website, it won’t show up in search. 

52% of sites misconfigure robots.txt files, accidentally blocking important sections. You might be blocking your own moneymaking pages without knowing it.

Robots.txt Configuration

A robots.txt file is a plain text file that instructs search engine crawlers which URLs they can or cannot access on a website. Website owners place the file in the root directory to control crawl behavior through user-agent and disallow directives. Robots.txt manages crawl budget and prevents indexing of non-public or duplicate sections.

What We Check:

  • That a robots.txt exists and is accessible
  • That it is not blocking important pages/sections
  • Crawl-delay directives 
  • Sitemap location declared
  • Not accidentally blocking CSS/JS

Common Robots.txt errors: Inexperienced web design agencies accidentally blocking the entire site with “Disallow: /”. Blocking /wp-admin/ AND /wp-content/ (which blocks all images and CSS). Blocking JavaScript, which prevents Google from rendering the site properly.

Business Impact: If pages are being blocked by robots.txt, they won’t rank. Those pages are completely invisible to search engines. 

How We Test: We manually review the robots.txt file, use Google Search Console’s robots.txt report, and cross-reference it against critical pages to ensure nothing important is blocked. 
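That cross-reference is easy to script with Python’s standard-library robots.txt parser. A sketch (the example.com URLs are placeholders) that catches the classic staging leftover:

```python
from urllib.robotparser import RobotFileParser

# Simulate a misconfigured robots.txt left over from a staging site.
robots_txt = """\
User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Critical pages to cross-reference (placeholders for a real URL list).
critical = ["https://example.com/", "https://example.com/services/"]
blocked = [url for url in critical if not rp.can_fetch("Googlebot", url)]
print(blocked)  # every critical page is blocked
```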

Unfortunately, I’ve seen this happen more than once. A client comes to me saying they had their website redesigned, and traffic is plummeting. While analyzing it, I see that the ENTIRE website is disallowed. This is a rookie mistake, but it happens more often than you would think. 

 

XML Sitemaps

An XML sitemap is a structured file that lists a website’s important URLs to help search engines discover and crawl content efficiently. Website owners submit XML sitemaps through tools such as Google Search Console to signal canonical pages, update frequency, and last modification dates. XML sitemaps improve crawl coverage and support faster indexing of new or updated pages.

What We Examine:

  • Make sure a sitemap exists and is submitted to Google Search Console
  • Check that the sitemap includes all important pages
  • Doesn’t include blocked/noindexed pages
  • No redirect URLs in the sitemap 
  • Under 50MB/50,000 URLs per sitemap
  • Properly formatted XML file

Common Sitemap Issues:

  • Sitemaps including URLs that 404, redirect, or are noindexed
  • Sitemap not referenced in robots.txt
  • Not submitted to Google Search Console
  • Sitemap includes low-value pages (tag archives, author pages)
  • Missing high-value pages

Business Impact: Your sitemap tells Google what pages and posts are important. If your priority pages aren’t in the sitemap, you’re missing out. 
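A quick way to sanity-check a sitemap before submitting it is to parse it and verify the namespace and URL count. An illustrative sketch using Python’s standard library (the sitemap content is a placeholder):

```python
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/services/</loc></url>
</urlset>"""

root = ET.fromstring(sitemap_xml)          # raises ParseError if malformed
urls = [u.findtext(f"{NS}loc") for u in root.findall(f"{NS}url")]

assert root.tag == f"{NS}urlset", "wrong root element or namespace"
assert len(urls) <= 50_000, "over the limit: split into multiple sitemaps"
print(urls)
```

In a real audit, each extracted URL would then be checked for 200 status, canonical self-reference, and the absence of a noindex directive.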

 

Indexability Issues

Website indexability refers to a search engine’s ability to analyze, process, and store a webpage in its index. A page becomes indexable when it returns a 200 status code, allows crawling, and avoids noindex directives or canonical conflicts. Proper indexability ensures search engines can rank the page for relevant queries.

What We Audit:

  • Pages intended to rank aren’t noindexed
  • Canonical tags pointing to the correct versions
  • No indexation conflicts (canonical vs noindex)
  • No orphaned pages (important content not linked from anywhere)
  • URL structure is clean and logical
  • No excessive URL parameters creating duplicate content

Common Indexing Problems:

  • Staging/noindex tags left on production site (a complete disaster)
  • Canonical tags pointing to the wrong pages
  • Important pages set to noindex accidentally
  • Infinite scroll/load-more pagination is not crawlable
  • Important content behind login walls
  • Content in iframes

At least 3 times, I’ve audited sites where the development team left noindex tags from the staging site on the production site. The sites had been live for months, getting zero organic traffic. The owners couldn’t figure out why their traffic disappeared after the redesign. One meta tag cost them hundreds of thousands in lost revenue.

Tools we use to analyze indexing issues: Screaming Frog crawl, Search Console coverage report, manual review of critical pages, and indexation spot-checks.
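The indexation spot-check can be partially automated. This illustrative sketch uses Python’s built-in HTML parser to flag a leftover robots noindex meta tag:

```python
from html.parser import HTMLParser

class NoindexCheck(HTMLParser):
    """Flags a <meta name="robots" content="...noindex..."> tag."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in (a.get("content") or "").lower():
                self.noindex = True

# A page with the staging noindex tag still in place (hypothetical markup).
html = '<head><meta name="robots" content="noindex, nofollow"></head>'
checker = NoindexCheck()
checker.feed(html)
print(checker.noindex)  # True: this page cannot be indexed
```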

 

 

Category 3: Site Architecture & Internal Linking – How Information Flows

Site architecture and internal linking define how you structure and connect pages within a website. A clear architecture organizes content into logical categories and subcategories, while internal links distribute authority and guide crawlers to important pages. This structure improves crawl efficiency, topical relevance, and ranking signal consolidation.

 

Site Depth & Click Distance

Site depth and click distance measure how many clicks a user or crawler needs to reach a page from the homepage. A shallow site depth keeps important pages within three clicks, which improves crawl efficiency and link equity distribution. Reduced click distance increases page discovery speed and strengthens ranking signals.

What We Measure:

  • How many clicks from the homepage to important pages
  • Orphaned pages (no internal links pointing to them)
  • Site hierarchy depth
  • Navigation structure clarity

Important pages should be 3 clicks or fewer from the homepage. Every click further from the homepage = less crawl priority = weaker rankings.

Common Architecture Mistakes:

  • Blog posts buried 5+ clicks deep
  • Product/service pages that are only accessible through multiple navigation layers
  • Important landing pages are not linked from the main navigation
  • No internal linking between related content pieces

Business Impact: Pages with a high click distance rarely rank well, even if the content is great. To Google, the further a page is from home, the less important it is. 

How We Analyze: Screaming Frog site crawl, analyze click depth reports, review navigation structure, and check for orphaned pages.
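Under the hood, click-depth analysis is a breadth-first search over the internal link graph a crawler exports. A sketch with hypothetical URLs; pages more than 3 clicks deep and orphans fall out naturally:

```python
from collections import deque

# Internal link graph from a crawl: page -> pages it links to (placeholders).
links = {
    "/": ["/services", "/blog"],
    "/services": ["/services/seo"],
    "/blog": ["/blog/post-1"],
    "/blog/post-1": ["/blog/post-2"],
    "/blog/post-2": ["/blog/post-3"],
}
# "/orphan" appears in the sitemap but no internal link points to it.
all_pages = set(links) | {p for ts in links.values() for p in ts} | {"/orphan"}

# Breadth-first search from the homepage gives each page's click distance.
depth = {"/": 0}
queue = deque(["/"])
while queue:
    page = queue.popleft()
    for target in links.get(page, []):
        if target not in depth:
            depth[target] = depth[page] + 1
            queue.append(target)

too_deep = [p for p, d in depth.items() if d > 3]   # beyond 3 clicks
orphans = sorted(all_pages - depth.keys())          # unreachable pages
print(too_deep, orphans)
```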

 

Internal Linking Strategy

Internal linking connects pages within the same domain through hyperlinks. These links help search engines discover content, understand topical relationships, and distribute PageRank across URLs. Strategic anchor text clarifies entity context and strengthens semantic relevance. Effective internal linking improves crawl paths, indexation rates, and ranking signal consolidation.

What We Audit:

  • Anchor text distribution
  • Link equity flow to priority pages
  • Related content linking
  • Breadcrumb implementation
  • Footer/sidebar link optimization
  • Broken internal links

Internal Linking is one of the most powerful on-page SEO tactics. Strategic internal linking distributes PageRank throughout the site, helping all pages rank better. Poor linking starves important pages of link equity. 

Common Internal Link Issues:

  • No contextual internal links within content
  • All links use generic “click here” or “read more” anchors
  • Important pages have 2-3 internal links, while low-priority pages have 50+
  • Broken internal links create dead ends
  • No logical flow between related topics

52% of websites have broken links. These create a poor user experience, waste crawl budget, and damage trust.

Internal Linking Best Practices: We utilize proven, tested internal linking strategies. We design internal linking maps showing how authority flows through a site. Priority pages get links from high-authority, topically relevant pages with optimized anchor text. We interlink content clusters to build topical authority. We don’t just add more links; we add strategic links that power up the entire site. 
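A first pass at the anchor text distribution check can be as simple as counting anchors from a crawl export. A sketch with hypothetical data:

```python
from collections import Counter

# Anchor texts collected from a site crawl (hypothetical sample).
anchors = ["click here", "read more", "technical SEO audit",
           "click here", "web design pricing", "read more"]

GENERIC = {"click here", "read more", "learn more", "here"}

counts = Counter(a.lower() for a in anchors)
# Share of links that waste their anchor text on generic phrases.
generic_share = sum(counts[a] for a in GENERIC) / len(anchors)
print(counts.most_common(2), f"{generic_share:.0%} generic")
```

A high generic share means link equity is flowing without any topical signal attached to it.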

Crawl Budget Efficiency: Crawl budget isn’t an issue for small sites. But for a large website, it can become a real challenge. You don’t want to waste your crawl budget on low-value pages. For large sites struggling with crawl budget, we restructure to direct the crawl budget to pages that drive your revenue. 

 

 

Category 4: Mobile Optimization – Where 60%+ of Your Traffic Comes From

Mobile optimization ensures a website functions correctly and efficiently on smartphones and tablets. Google uses mobile-first indexing, which means it primarily evaluates the mobile version of a page for ranking. You improve mobile optimization by using responsive design, optimizing viewport settings, compressing media files, and ensuring readable text without zoom. Proper mobile optimization increases usability, crawlability, and Core Web Vitals performance on smaller screens.

Google uses mobile-first indexing. If your mobile experience is bad, your rankings will be bad as well. 

Mobile-Friendly Test

The Mobile-Friendly Test is a tool that evaluates whether a webpage meets Google’s mobile usability standards. The test analyzes responsive design, viewport configuration, text readability, tap target spacing, and content width. It reports rendering issues that affect mobile-first indexing and user experience. You use the results to identify layout errors, blocked resources, and usability problems that impact rankings on mobile search.

What We Verify:

  • Site passes Google Mobile-Friendly Test
  • Responsive design is properly implemented
  • No mobile-specific errors in Search Console
  • Content identical on mobile/desktop (Google indexes mobile version)
  • No intrusive interstitials on mobile

Common Critical Issues:

  • Text too small to read (common on “responsive” sites that aren’t truly responsive)
  • Tap targets too close together or smaller than the 44×44 pixel minimum
  • Horizontal scrolling required
  • Mobile content is hidden/different from desktop
  • Flash or other unsupported technologies
  • Viewport not configured properly

Business Impact: Mobile-friendly websites are 67% more likely to rank on the first page of Google. Competitors with responsive, mobile-first websites will outrank you even if your content quality and SEO are better. As of July 2024, Google stopped indexing websites that aren’t accessible on a mobile device. So if your website isn’t mobile-friendly, it may be completely excluded from search. 

 

Mobile Page Speed

Mobile page speed measures how quickly a webpage loads and becomes interactive on mobile devices. Google evaluates mobile speed using metrics such as Largest Contentful Paint, Interaction to Next Paint, and Cumulative Layout Shift. Slow mobile performance increases bounce rates and reduces conversion rates. You improve mobile page speed by compressing images, minimizing JavaScript, enabling browser caching, and using a content delivery network.

Mobile-Specific Performance Checks:

  • LCP under 2.5s on 3G connection
  • Render-blocking resources eliminated
  • Images optimized for mobile (smaller file sizes)
  • Mobile-specific JavaScript optimized
  • Legacy 300ms tap delay eliminated (a properly configured viewport removes it)

Why Mobile Speed Is Harder: Mobile devices tend to deal with slower connections like 3G/4G vs. WiFi. They also have far less processing power. 

According to DesignRush, 53% of mobile users abandon sites that take over 3 seconds to load. If your mobile version takes 5 seconds to load, you’re losing more than half of your traffic. 

Mobile Performance Testing: We test with Google PageSpeed Insights and GTmetrix. We also test on real devices on real networks. Simulations are great, but sometimes they miss things. 

 

Mobile User Experience

Mobile user experience refers to how effectively and efficiently users interact with a website on mobile devices. It evaluates usability factors such as responsive layout, readable typography, tap target size, navigation clarity, and load speed. Strong mobile user experience reduces bounce rates, increases session duration, and improves conversion rates. You enhance mobile user experience by optimizing design, performance, and accessibility for smaller screens.

UX Elements We Test:

  • Navigation works on touch devices
  • Forms are easy to complete on mobile
  • Pop-ups don’t cover the entire screen
  • Content is readable without zooming
  • Buttons large enough for thumbs
  • No hover-dependent functionality

Common Mobile UX Disasters:

  • Mega-menus that look cool on a desktop but don’t work on touch
  • Pop-ups covering the full screen with a tiny X button
  • Forms with tiny input fields
  • “Call Now” buttons that don’t actually call
  • Content requiring pinch-zoom to read

 

Category 5: On-Page Technical Elements – The SEO Fundamentals

On-page technical elements are HTML and code-level components that help search engines interpret and index a webpage. These elements include title tags, meta descriptions, header tags, canonical tags, structured data, image alt attributes, and status codes. Proper implementation clarifies topical relevance, prevents duplication issues, and strengthens crawl efficiency. Optimized on-page technical elements improve search visibility and support accurate ranking signals.

Title Tags

Title tags are HTML elements that define a webpage’s title and appear in search engine results as the clickable headline. Search engines use title tags to understand a page’s topic and ranking relevance. 

What We Audit:

  • Every page has a unique title
  • Titles under 60 characters (avoid truncation)
  • Target keyword included naturally
  • Brand name placement (end of title typically)
  • No duplicate titles across the site

54% of websites use duplicate title tags. When this happens, Google has to guess what the page is about. And that is never a good thing. 

This is endemic. I recently worked on an SEO project for a large national brand that had its website redesigned by a leading national agency. The site cost them well over $30,000, and guess what? There were hundreds of pages with duplicate meta titles AND meta descriptions. 
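Duplicate and over-length titles are trivial to surface from a crawl export. A sketch with hypothetical URLs and titles:

```python
# Title tags from a crawl export: URL -> <title> text (hypothetical data).
titles = {
    "/": "Web Design & SEO | Acme Studio",
    "/services": "Web Design & SEO | Acme Studio",
    "/blog/audit-checklist": "Technical SEO Audit Checklist: What We Examine on Every Client Site",
}

# Duplicates: titles shared by more than one URL.
seen = {}
for url, title in titles.items():
    seen.setdefault(title, []).append(url)
duplicates = {t: urls for t, urls in seen.items() if len(urls) > 1}

# Truncation risk: Google cuts titles at roughly 60 characters.
too_long = [url for url, title in titles.items() if len(title) > 60]

print(duplicates)
print(too_long)
```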

Common Title Problems:

  • Same title on 50+ pages 
  • Keyword stuffing Meta Titles
  • Missing titles entirely
  • Truncated titles (Google shows […] cutting off important info)
  • ALL CAPS titles (looks like spam)

Business Impact: The title tag is one of the most important on-page ranking factors. Leaving it empty or unoptimized has serious effects on search rank. 

Title Optimization Strategy: Title optimization is handled case by case. Typically, it’s the page’s primary keyword aligned with the H1, followed by a secondary keyword and then the brand name. It’s a balance: titles need to rank, but also drive clicks. 

Meta Descriptions

Meta descriptions are HTML meta tags that summarize a webpage’s content for search engine results pages. Search engines display meta descriptions below the title tag to inform users about page relevance.

Quality Checks:

  • Unique descriptions per page
  • 150-160 characters optimal length
  • Includes target keywords (bolded in SERPs)
  • Written to drive clicks with a solid CTA
  • Highlights differentiators that help it stand out in the SERPs
  • Not keyword-stuffed

50% of websites have duplicate meta descriptions. This leads to a low click-through rate. 

Why Descriptions Matter: Your meta description is your elevator pitch in the SERPs. The user is looking at a whole page of them. You have to stand out. A good description leads to higher CTRs and traffic. 

Header Tag Hierarchy

Header tag hierarchy defines the structured use of HTML heading elements from H1 to H6 to organize webpage content. Search engines analyze header tags to understand topical structure and semantic relationships.

Hierarchy Audit:

  • One H1 per page (contains primary keyword)
  • Logical H2-H6 structure
  • No skipped levels (H2 to H4 without H3)
  • Headings accurately describe sections
  • Natural integration of keywords in headings

Why Structure Matters: Google uses headers to understand the topic of a page and all its subtopics. People tend to skim, so a well-thought-out heading strategy helps users find the info they need. 

Common Header Mistakes:

  • Multiple H1s (dilutes focus)
  • No H1 at all
  • Headers used for styling instead of structure
  • No headers in long content blocks
  • Keyword stuffing in every header
  • Random header levels (H2, H4, H2, H5, H3 chaos)

Nearly 100% of sites on page 1 use a target keyword in the H1. 
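The hierarchy rules above (exactly one H1, no skipped levels) can be checked with Python’s built-in HTML parser. An illustrative sketch:

```python
from html.parser import HTMLParser

class HeadingAudit(HTMLParser):
    """Collects h1-h6 levels in document order."""
    def __init__(self):
        super().__init__()
        self.levels = []

    def handle_starttag(self, tag, attrs):
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self.levels.append(int(tag[1]))

def heading_issues(html: str) -> list:
    audit = HeadingAudit()
    audit.feed(html)
    issues = []
    if audit.levels.count(1) != 1:
        issues.append(f"expected one H1, found {audit.levels.count(1)}")
    for prev, cur in zip(audit.levels, audit.levels[1:]):
        if cur > prev + 1:
            issues.append(f"skipped level: H{prev} -> H{cur}")
    return issues

print(heading_issues("<h1>A</h1><h2>B</h2><h4>C</h4>"))  # flags the skipped H3
```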

 

Image Optimization

Image optimization improves how images load, render, and communicate context to search engines. You optimize images by compressing file sizes, using modern formats such as WebP, defining width and height attributes, and implementing descriptive alt text. 

Image SEO Checklist:

  • Alt text on all images (descriptive, includes keywords where relevant)
  • Descriptive filenames (not IMG_1234.jpg)
  • Compressed/optimized file size
  • Proper dimensions (not massive images scaled down by CSS)
  • Next-gen formats (WebP) 
  • Lazy loading implemented

Stats That Matter:

  • Image search accounts for 22.6% of all Google queries
  • Websites with descriptive alt tags see a 23% boost in organic traffic
  • Only 26% of websites use alt text properly
  • 36% of websites have oversized images, killing page speed

Business Impact: Empty or bad alt text causes accessibility issues and leaves ranking potential on the table. Images that are too large and not optimized lead to slow-loading sites, which lowers rankings and traffic. 

Image Optimization Process: We typically use Imagify for image compression and to convert to WebP files. Our WordPress framework includes lazy loading and alt text. 

 

 

Category 6: Schema Markup & Structured Data – Helping Google Understand Your Content

Schema markup and structured data provide standardized code that helps search engines understand entity relationships on a webpage. You implement structured data using formats such as JSON-LD to define elements like articles, products, reviews, and organizations. Search engines use this data to generate rich results, enhance visibility, and clarify contextual relevance. Proper schema markup strengthens semantic signals and improves click-through rates.

72.6% of the results on the first page of search have Schema Markup. 

 

What Schema Markup Does

Schema Markup is powerful when done properly. Instead of forcing Google’s bots to infer meaning from the whole page, Schema hands Google a structured snapshot. Schema reinforces aspects like location, awards, E-E-A-T, and keywords, and helps prevent entity ambiguity. Custom Schema can link to your Google Maps listing, your most authoritative citations, your social media profiles, and the Wikipedia pages that define the exact services you offer. 

The Problem It Solves: Take fencing. Say a company offers lessons in the sport of fencing. The word can mean the yard barrier, the sport, or the act of selling stolen goods. Schema removes the ambiguity by linking to the Wikipedia entity for fencing, the sport. 

How It Works: Structured Data Markup gives explicit labels and defines entities. It offers complete clarity to search engines. 

 

Rich Results It Enables:

  • Star ratings in search results
  • Product prices and availability
  • Recipe cooking times and ratings
  • Event dates and locations
  • FAQ dropdowns
  • Video thumbnails
  • Breadcrumb trails
  • Local business info (hours, phone, address)

Schema Stats That Matter:

  • Websites with structured data are 58% more likely to earn rich snippets
  • Rich snippets improve CTR by 20-35%
  • Users click rich results 58% of the time vs. 41% for standard results
  • FAQ rich results have 87% CTR

Business Impact: You and a competitor are ranking #3 and #4. One has a rich snippet with stars and pricing; the other has a plain blue link. Which gets more clicks? The rich snippet. 

Real Example: A client was ranking in the middle of page one. We added LocalBusiness schema with the ratings, service area, etc. The CTR went from 3.7% to 10.3% in a few days. Same position, but now it gets 3 times as many clicks. 

 

Schema Types We Implement

Schema types are structured data categories defined by Schema.org that specify the entity a webpage represents. Each type clarifies attributes, properties, and relationships for search engines. Common schema types include Article, Product, Organization, FAQPage, and LocalBusiness.

Priority Schema Types

Organization/LocalBusiness: Company name, logo, contact info, address, hours, social profiles, citations. Critical for local businesses. This schema can get quite deep when we add knowsAbout, mentions, and sameAs properties. 

Article/BlogPosting: Author, publish date, modified date, headline, article headings, featured image, and the article copy. Article schema powers Google News, Discover, and Google AI citations. 

Product: Price, availability, ratings, reviews. Essential for e-commerce. Products with a complete schema are 4.2X more likely to appear in Google Shopping. 

Service: Service descriptions, areas served, pricing (range or actual). Critical for service businesses. Service schema allows us to precisely define our services.  

FAQ: Questions and answers. Creates expandable FAQ sections in search results. 

Review/AggregateRating: Star ratings aggregated from reviews. Highly visible in search results.

Breadcrumb: Navigation path. Shows site hierarchy in search results. 

HowTo: Step-by-step instructions. Shows as rich cards for instructional content.

Event: Date, time, location, ticket info. Powers Google event search features.

Video: Upload date, description, thumbnail. Required for video-rich results.

Why Multiple Schema Types: Comprehensive markup creates a semantic web of entities and relationships. Google understands not just individual pages but how your entire site connects. 

Implementation Method: We use JSON-LD (recommended by Google), not Microdata or RDFa. Cleaner, easier to maintain, doesn’t clutter HTML. 
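In practice, JSON-LD is just a JSON object inside a script tag. A minimal LocalBusiness sketch built in Python; every value below is a placeholder, and a real implementation would include far more properties:

```python
import json

# Minimal LocalBusiness JSON-LD (all values are placeholders).
schema = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Acme Studio",
    "url": "https://example.com",
    "telephone": "+1-555-555-0100",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Springfield",
        "addressRegion": "NY",
        "postalCode": "10000",
    },
    "sameAs": [
        "https://www.facebook.com/acmestudio",
        "https://www.linkedin.com/company/acmestudio",
    ],
}

tag = f'<script type="application/ld+json">{json.dumps(schema, indent=2)}</script>'
# Round-trip the payload to confirm it is valid JSON before it ships.
payload = json.loads(tag.split(">", 1)[1].rsplit("<", 1)[0])
print(payload["@type"])
```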

 

Schema Validation & Testing

Schema validation and testing verify that structured data follows Schema.org guidelines and search engine requirements. You use tools such as Google Rich Results Test and Schema Markup Validator to detect syntax errors, missing properties, and eligibility issues.

How We Validate:

  • Google Rich Results Test for each schema type
  • Schema Markup Validator for syntax errors
  • Search Console Enhancements report for warnings/errors
  • Manual SERP checks to confirm rich results displaying

Common Schema Errors:

  • Missing required properties
  • Incorrect data types 
  • Mismatched information 
  • Deprecated schema types still in place
  • Invalid JSON syntax

Maintenance: Schema is not a “set it and forget it” asset. It should be updated when things change: prices, services, contact info, URLs, headlines, and business information. Outdated schema can trigger a loss of rich results, which means a loss of clicks. 

 

 

Category 7: Security & HTTPS – Trust Signals and Ranking Factors

Security and HTTPS protect data transferred between a user’s browser and a web server. HTTPS uses SSL or TLS encryption to secure communication and prevent interception or tampering. Search engines treat HTTPS as a ranking signal and label non-secure HTTP pages as unsafe. You implement HTTPS by installing an SSL certificate, redirecting HTTP to HTTPS, and updating internal links. Proper security increases user trust, protects sensitive data, and supports search visibility.

95% of Google’s top results use HTTPS. Non-secure sites have 50% higher bounce rates. Browsers label HTTP sites as “Not Secure.”

 

SSL Certificate & HTTPS Implementation

An SSL certificate authenticates a website’s identity and enables HTTPS encryption through the TLS protocol. The certificate encrypts data exchanged between the browser and the server, which protects login credentials, payment data, and personal information.

What We Verify:

  • Valid SSL certificate installed
  • All pages load via HTTPS
  • No mixed content warnings (HTTP resources on HTTPS pages)
  • HTTP to HTTPS redirects are working
  • Certificate not expired
  • Certificate covers all subdomains needed

Common HTTPS Problems:

  • Some pages are still serving HTTP
  • Images/CSS/JavaScript loading via HTTP (mixed content)
  • Canonical tags pointing to HTTP versions
  • Internal links pointing to HTTP
  • Sitemaps referencing HTTP URLs
  • Expired SSL certificates

Security Indicators: Browsers show a padlock icon for HTTPS. When there is no padlock, there is a warning that the website is not secure. This warning kills trust and leads to very high bounce rates. 

Business Impact: A warning stating the website is not secure leads to instant credibility loss. Many users won’t enter any information on a non-HTTPS site. Google uses HTTPS as a ranking factor, so non-secure sites also rank much lower. 

Migration Dangers: When migrating from HTTP to HTTPS, there are a lot of aspects that need to be updated. You need to make sure all elements are served via HTTPS (like images). You also must update internal links, canonical tags, and 301 redirects. 
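Mixed content after a migration is easy to scan for: any http:// resource embedded in an HTTPS page will trigger a browser warning. A simple illustrative scan (the markup is hypothetical):

```python
import re

# Flag http:// resources embedded in an HTTPS page (mixed content).
MIXED = re.compile(r'(?:src|href)\s*=\s*["\'](http://[^"\']+)', re.IGNORECASE)

html = """
<img src="http://example.com/hero.jpg">
<link rel="stylesheet" href="https://example.com/style.css">
<script src="http://example.com/app.js"></script>
"""

insecure = MIXED.findall(html)
print(insecure)  # the two http:// resources that trigger mixed-content warnings
```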

Security Headers & Best Practices

Security headers are HTTP response directives that protect websites from common vulnerabilities and attacks. The server sends these headers to instruct the browser how to handle content securely. Key security headers include Content-Security-Policy, Strict-Transport-Security, X-Content-Type-Options, and X-Frame-Options. You configure security headers to prevent cross-site scripting, clickjacking, MIME sniffing, and protocol downgrade attacks. Proper implementation strengthens website security and protects user data.

Advanced Security Checks:

  • Content Security Policy (CSP) header
  • X-Frame-Options (prevents clickjacking)
  • X-Content-Type-Options (prevents MIME sniffing)
  • Referrer-Policy configured
  • HTTP Strict Transport Security (HSTS)

Why These Matter: Security headers protect against common attacks. While not direct ranking factors, they build trust and prevent security incidents that would destroy rankings (hacked site = Google removes from index).
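Checking for these headers is mechanical once you have a response’s header set. A sketch; the expected list mirrors the checks above, and the sample response is hypothetical:

```python
# Security headers we expect on every response.
EXPECTED = {
    "strict-transport-security",
    "x-frame-options",
    "x-content-type-options",
    "content-security-policy",
    "referrer-policy",
}

def missing_security_headers(response_headers: dict) -> list:
    """Return the expected security headers absent from a response."""
    present = {name.lower() for name in response_headers}
    return sorted(EXPECTED - present)

# Example response headers captured during an audit (hypothetical).
headers = {
    "Content-Type": "text/html",
    "Strict-Transport-Security": "max-age=31536000",
    "X-Content-Type-Options": "nosniff",
}
print(missing_security_headers(headers))
```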

Backup & Recovery:

  • Daily automated backups
  • Tested restore procedures
  • Offsite backup storage
  • Version control for code

Security Best Practices:

  • WordPress/plugins kept updated (90% of WordPress hacks exploit outdated plugins)
  • Strong password policies
  • Two-factor authentication
  • Limited user permissions
  • Regular security scanning

The Hacking Disaster: A hacked site can get de-indexed or flagged with a "this site may be hacked" warning in search results. When that happens, traffic can drop to zero overnight. Recovery, including malware removal and Google's review process, can take weeks. Prevention costs far less than recovery. 

Our Technical Audit Process & Tools

Our technical audit combines several tools with manual review. We typically start with a full crawl in Screaming Frog or Sitebulb to cover the basics; that crawl usually surfaces areas that need a closer look. Next, we add the site to Ahrefs, run a health check, and set up ongoing health monitoring. Finally, we go into Google Search Console to check indexation status, manual actions, and other reported issues. 

Tools We Use

We use a range of tools when we perform a technical audit. 

Crawling & Analysis:

  • Screaming Frog SEO Spider: Full site crawl identifying technical issues. Analyze title tags, meta descriptions, headers, response codes, redirect chains, broken links, page depth, and duplicate content.
  • Ahrefs Site Audit: Automated technical health monitoring, 100+ technical checks, crawl budget analysis.
  • Google Search Console: Indexation status, coverage errors, mobile usability issues, Core Web Vitals data, manual actions, and security issues.
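One of the crawl findings listed above, redirect chains, is easy to illustrate. Given a mapping of redirecting URLs to their targets (assumed data, such as a crawler export), the sketch below traces each path and flags anything longer than a single hop, including loops:

```python
def redirect_chains(redirects: dict, max_hops: int = 10) -> dict:
    """Map each starting URL to its full redirect path.

    `redirects` maps a URL to the URL it 301/302s to. A path with more
    than one hop is a chain worth collapsing into a single redirect;
    a repeated URL in the path is a redirect loop.
    """
    chains = {}
    for start in redirects:
        path = [start]
        current = start
        while current in redirects and len(path) <= max_hops:
            current = redirects[current]
            if current in path:          # redirect loop detected
                path.append(current)
                break
            path.append(current)
        if len(path) > 2:                # more than one hop = a chain
            chains[start] = path
    return chains
```

Each chain found should be collapsed so the original URL redirects straight to the final destination, which saves crawl budget and preserves link equity.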

Performance Testing:

  • PageSpeed Insights: Google’s official performance tool, real user data from Chrome UX Report.
  • GTmetrix: Detailed waterfall analysis, historical performance tracking.
  • WebPageTest: Test from multiple locations/devices/connections, filmstrip view of loading.

Mobile Testing:

  • Google Mobile-Friendly Test: Official mobile compatibility check.
  • BrowserStack: Test on real devices (50+ phones/tablets).
  • Chrome DevTools Device Mode: Initial responsive testing.

Schema Validation:

  • Google Rich Results Test: Validates schema markup, shows a preview of rich results.
  • Schema Markup Validator: Checks JSON-LD syntax and completeness.

Security Testing:

  • SSL Labs: Comprehensive SSL/TLS configuration testing.
  • SecurityHeaders.com: Security header analysis.
  • Sucuri SiteCheck: Malware and vulnerability scanning.

Why Multiple Tools: No single tool catches everything. Screaming Frog might miss issues that Search Console shows. PageSpeed Insights provides different data than GTmetrix. Comprehensive audits use 10-15 tools, cross-validating findings.

 

Our Audit Methodology

Phase 1: Automated Crawl (Day 1). Run a full Screaming Frog crawl and export the data. Run an Ahrefs site audit. Review Search Console data (requires access).

Phase 2: Performance Analysis (Days 1-2). Test the top 20 pages with PageSpeed Insights, GTmetrix, and WebPageTest. Identify performance patterns. Document Core Web Vitals issues.

Phase 3: Manual Review (Days 2-3). Check robots.txt, the sitemap, and critical pages. Review indexation coverage. Test the mobile experience on real devices. Validate schema markup. Check security headers.

Phase 4: Competitive Comparison (Day 3). Audit the top 3 competitors' technical SEO. Identify gaps and opportunities. See what they're doing right and wrong.

Phase 5: Prioritization & Reporting (Days 4-5). Categorize issues by severity (Critical/High/Medium/Low). Estimate business impact. Create an action plan with a timeline. Write a comprehensive report with before/after projections.

Deliverable: 20-50 page audit report with:

  • Executive summary (business impact)
  • Issue severity breakdown
  • Category-by-category findings
  • Competitor comparison
  • Prioritized action plan
  • Effort estimates
  • Expected traffic/ranking impact

Why This Depth: Often, less experienced SEOs will run one tool, export the report, and call it a day. That's not an audit; it's a data dump. We go to these lengths so we know every issue that exists, why it exists, and how it impacts your business. That understanding lets us strategize which fixes to make first to generate the fastest results. Our technical audits are strategic documents, not vague problem lists.

If you’re experiencing traffic drops or your website is not ranking well overall, a technical audit might be just what you need. Click here to fill out our contact form and get started. 

Gabriel Bertolo - Founder of Radiant Elephant

Gabriel Bertolo

Gabriel Bertolo is a 3rd generation entrepreneur who founded Radiant Elephant over 13 years ago after working for various advertising and marketing agencies. 

He is also an award-winning Jazz/Funk drummer and composer, as well as a visual artist.

His Web Design, SEO, and Marketing insights have been quoted in Forbes, Business Insider, Hubspot, Entrepreneur, Shopify, MECLABS, and more.

Check out some publications he's been quoted in:

Quoted in HubSpot's AI Search Visibility Article and HubSpot's Article on 6 Best Wix Alternatives

Quoted in DesignRush Dental Marketing Guide 

Quoted in MECLABS 

Quoted in DataBox Website Optimization Article and DataBox Best SEO Blogs

Quoted in Seoptimer

Quoted in Shopify Blog