How to Audit a Website: A Practical Guide

TL;DR
Your website has probably been live for a few years. It's been updated by different people, migrated at least once, and has plugins or features nobody remembers adding. It works, mostly, but you suspect it could be doing better. A website audit tells you what's actually happening under the hood and, more importantly, which problems are worth fixing.
What Is a Website Audit?
A website audit is a systematic review of your site's technical health, performance, and effectiveness. Think of it like a diagnostic check for your car: you're looking for issues that are costing you now, issues that will cost you later, and things that are fine to leave alone. The term covers a lot of ground. Performance audits measure how fast your site loads, SEO audits examine how search engines see your pages, UX audits evaluate how real humans experience your site, and security audits check for vulnerabilities. Most businesses need a combination, and most audits draw on elements of each, because these categories overlap in practice. A slow site is both a performance problem and an SEO problem. A confusing checkout flow is both a UX issue and a revenue issue. Understanding these overlaps is key to knowing why websites need audits in the first place.
Why Websites Need Audits
Websites decay. Not dramatically: they don't crash overnight. They do, however, accumulate small problems that compound into a significant drag on your business. Consider what happens over three years of normal operation. A page that loaded in 2 seconds when you launched now takes 4 because of added tracking scripts, larger images, and accumulated plugins. Each of those extra seconds costs you roughly 7% of conversions, according to Google's research on page speed and user behavior.1 Meanwhile, title tags that were fine at launch now look generic next to competitors who've optimised theirs. You're ranking at position 8 instead of position 3, which means roughly 75% less organic traffic to that page. And somewhere along the way, a contact form that works perfectly on desktop started breaking on certain mobile browsers. You don't know, because you don't use those browsers. Neither does anyone on your team.
None of this shows up as an emergency. The cumulative effect, however, is a site that's working against you instead of with you. An audit surfaces these issues and helps you prioritize which ones actually matter. To understand what an audit looks for, it helps to know how websites actually work under the hood.
The Mechanics of Performance
When someone visits your website, their browser makes a request to your server. The server responds with HTML, which the browser starts parsing. As it parses, it discovers additional resources it needs to request from the server: stylesheets, JavaScript files, images, fonts, and so on. Only after enough of these resources have loaded and been processed can the browser render something useful on screen. Performance audits measure how efficiently this process happens, and the metrics that matter most are Google's Core Web Vitals: Largest Contentful Paint (loading speed), Interaction to Next Paint (responsiveness), and Cumulative Layout Shift (visual stability). They matter because Google uses them as ranking factors and because they correlate with user experience.
Beyond Core Web Vitals, server response time (Time to First Byte) tells you whether your hosting infrastructure is a bottleneck, total page weight tells you how much data visitors have to download, and number of requests tells you how many round trips (request-response cycles) the browser needs to make. All of these contribute to the overall experience of "this site feels fast" or "this site feels slow." However, speed is only part of the picture. A fast site that search engines can't find properly is still a site that's underperforming.
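The request-discovery cycle described above can be sketched with a small parser. The function below counts how many round trips a browser would need for a given HTML document. It's a deliberately simplified model using only Python's standard library: real browsers also fetch fonts, preloaded assets, and resources referenced from within CSS.

```python
from html.parser import HTMLParser


class ResourceFinder(HTMLParser):
    """Collects the sub-resources a browser would request while parsing HTML."""

    def __init__(self):
        super().__init__()
        self.resources = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag in ("img", "script") and a.get("src"):
            self.resources.append(a["src"])
        elif tag == "link" and a.get("rel") == "stylesheet" and a.get("href"):
            self.resources.append(a["href"])


def count_requests(html_doc: str) -> int:
    """Rough request count: the HTML document itself plus every discovered resource."""
    finder = ResourceFinder()
    finder.feed(html_doc)
    return 1 + len(finder.resources)
```

Fewer requests generally means fewer round trips before first render, which is why bundling files and removing unused scripts show up in almost every performance audit.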
How Search Engines See Your Site
Search engine optimisation has a reputation for being mysterious, and that reputation is partly deserved: Google is a black box, and nobody outside it knows exactly how each signal is weighted. The technical side, however, is quite straightforward. Search engines send automated crawlers to visit your pages, follow links, and build an index of what exists on the web. When someone searches, the engine consults this index and returns results ranked by relevance and quality. Technical SEO problems interfere with this process at different stages, and understanding where the breakdown occurs determines how you fix it. You often hear about crawlability and indexability2, but I would add a third part: on-page factors.
Crawlability problems prevent search engines from finding your pages in the first place. Broken or inconsistent internal linking means crawlers can't navigate your site effectively, misconfigured robots.txt files might accidentally block pages you want indexed, and orphan pages with no internal links pointing to them often go undiscovered entirely. Indexability problems prevent pages from appearing in search results even after crawlers find them. An accidental noindex tag tells Google not to list a page, duplicate content confuses the engine about which version to show, and canonical tag errors can point Google to the wrong URL. These issues are invisible to regular visitors but completely change how search engines treat your content, which is why they're easy to miss without a proper audit.
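Both failure modes are easy to check programmatically. A sketch using only Python's standard library (the URLs and rules below are illustrative): `urllib.robotparser` answers whether a robots.txt file blocks a given URL, and a small parser spots an accidental noindex meta tag.

```python
import urllib.robotparser
from html.parser import HTMLParser


def blocked_by_robots(robots_txt: str, url: str, agent: str = "*") -> bool:
    """Crawlability check: would this robots.txt stop a crawler from fetching url?"""
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return not rp.can_fetch(agent, url)


class _MetaRobots(HTMLParser):
    """Looks for <meta name="robots"> directives containing "noindex"."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.noindex = self.noindex or "noindex" in a.get("content", "").lower()


def has_noindex(html_doc: str) -> bool:
    """Indexability check: does the page carry a noindex directive?"""
    parser = _MetaRobots()
    parser.feed(html_doc)
    return parser.noindex
```

In a real audit you would run these checks against every URL a crawl discovers, which is essentially what tools like Screaming Frog do at scale.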
Once your pages are crawlable and indexable, on-page factors affect how well you rank. Your title tag is the most important single element. It appears in search results and tells both engines and users what the page is about. A title that says "Home - Company Name" provides no information about what you offer. Headings structure your content and signal what's important. Meta descriptions don't directly affect ranking but do affect whether people click through to your site. Schema markup helps search engines understand your content well enough to display rich results like star ratings, event times, or product prices as well as what the role of the page and your organisation is. Site architecture (how your pages are organized and linked together) affects how authority flows through your site. A page buried four clicks deep from your homepage will generally rank worse than a page one click away, all else being equal. Technical SEO is measurable and mechanical, but user experience requires a different kind of evaluation.
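These on-page checks are mechanical enough to automate. A minimal sketch below pulls out the title, h1 headings, and meta description; the "generic title" heuristic is my own assumption for illustration, not an official standard.

```python
from html.parser import HTMLParser


class OnPageParser(HTMLParser):
    """Extracts the on-page elements an SEO audit looks at first."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.h1s = []
        self.meta_description = None
        self._current = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag in ("title", "h1"):
            self._current = tag
            if tag == "h1":
                self.h1s.append("")
        elif tag == "meta" and a.get("name", "").lower() == "description":
            self.meta_description = a.get("content", "")

    def handle_data(self, data):
        if self._current == "title":
            self.title += data
        elif self._current == "h1":
            self.h1s[-1] += data

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None


def on_page_issues(html_doc: str) -> list:
    """Flags the most common on-page problems found during audits."""
    p = OnPageParser()
    p.feed(html_doc)
    issues = []
    title = p.title.strip()
    if not title:
        issues.append("missing title tag")
    elif title.lower().startswith("home"):  # assumed heuristic for generic titles
        issues.append("generic title tag")
    if len(p.h1s) != 1:
        issues.append(f"expected one h1, found {len(p.h1s)}")
    if not p.meta_description:
        issues.append("missing meta description")
    return issues
```

A page titled "Home - Company Name" with no h1 and no meta description would be flagged three times; a well-structured page returns an empty list.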
User Experience Beyond the Numbers
Automated tools can measure page speed and flag missing alt tags, but they can't tell you whether your site is actually pleasant to use. That requires human judgment. The most important UX question is whether people can accomplish their goals; if someone visits your site to buy a product, can they find it, understand it, and purchase it without friction? If they want to contact your sales team, is the path obvious? If they need specific information, can they locate it within a few clicks?
Mobile experience deserves special attention because it's where most problems exist. Responsive design that technically works often fails in practice: forms designed for desktop become frustrating on phones, tap targets that seem fine on a large screen become impossible to hit accurately on a small one, and text sized for monitors requires zooming on mobile devices. The only way to find these issues is to actually use your site on a phone: not a simulation, but a real device held in your hand. Accessibility overlaps with usability here: can someone navigate your site using only a keyboard? Can screen readers interpret your content correctly? Is there sufficient color contrast for people with visual impairments? The Web Content Accessibility Guidelines (WCAG) provide standards, but the underlying principle is simple: your site should work for everyone who wants to use it.
Jakob Nielsen's usability heuristics3 offer a useful framework for evaluation: Does your site provide clear feedback when users take actions? Does it use language your audience understands rather than internal jargon? Can users recover easily from mistakes? Are similar functions consistent throughout? These aren't metrics you can automate, they require walking through your site as a user would and noting where things feel confusing, slow, or broken. The same human judgment applies to security, though the stakes are different.
Security Considerations
Security audits check whether your site is vulnerable to attack, and the basics matter most. Is your SSL certificate valid and properly configured? Is your CMS and all its plugins updated to versions without known vulnerabilities? Are sensitive directories and files protected from public access? Beyond the basics, security headers tell browsers how to handle your content safely. A Content Security Policy restricts what scripts can run on your pages, defending against injection attacks. Missing security headers are low-hanging fruit that many sites neglect.
If you're handling sensitive data (customer information, payment details, login credentials), the stakes are higher. Form vulnerabilities like SQL injection4 and cross-site scripting (XSS)5 can expose your database or your users to attack, making professional penetration testing worth the investment. But most business websites have more mundane security gaps: outdated WordPress installations, plugins that haven't been updated in years, admin URLs that are exposed and guessable. These are the issues a standard security audit catches. Once you understand what audits measure, the next question is how to actually run one.
Running an Audit: Tools and Process
The good news is that most of the tools you need are free. For performance, Google PageSpeed Insights runs a Lighthouse audit on any URL and shows both lab data (controlled test conditions) and field data (real user experiences from Chrome). GTmetrix and WebPageTest provide detailed waterfall charts showing exactly what's loading and when. For any performance tool, test your homepage, your highest-traffic pages, and any page where conversions happen, and always test mobile, where performance is typically worse.
For SEO, Screaming Frog's SEO Spider is the industry standard. It crawls your site like a search engine would, finding broken links, duplicate content, missing tags, redirect chains, and structural issues. The free version handles sites up to 500 URLs. Google Search Console shows you how Google actually sees your site: which pages are indexed, which have errors, and what search queries bring traffic. Paid tools like Ahrefs and Semrush add competitive analysis and backlink auditing, useful but not essential for a basic technical audit.

For UX and accessibility, the WAVE browser extension flags accessibility problems as you browse, and Chrome DevTools' device emulation gives a quick first check of responsive design, but real UX issues require manual review: walking through your key user journeys and noting where things feel wrong.

For security, Mozilla Observatory grades your security headers and SSL configuration, OWASP ZAP scans for common web application vulnerabilities, and WPScan checks for known WordPress plugin and theme vulnerabilities if that's your platform.
There are many tools, and learning them all takes time; that's why professionals do these audits for a living. That said, the process matters as much as the tools. Start by defining what success looks like for your site. What actions do you want visitors to take? Then audit with those goals in mind. A slow page that nobody visits matters less than a moderately slow page that drives sales. A missing meta description on a blog post from 2019 matters less than missing structured data on your product pages. This leads to the hardest part of any audit: deciding what to actually fix.
What Audits Typically Uncover
Patterns emerge across different sites and industries. Performance issues almost always involve images: unoptimised photos often account for 50-80% of total page weight6. Render-blocking resources, where the browser waits for CSS and JavaScript to load before showing content, are nearly universal. Third-party scripts accumulate over time: analytics, chat widgets, marketing tags, social embeds, each adding weight and latency.
SEO issues cluster around metadata and structure. Here are a few examples of things I've encountered over the years:
- Title tags are often too generic or duplicated across pages.
- H1 headings are absent or don't describe the page content.
- Meta descriptions provide no reason to click through.
- Internal links are broken or inconsistent.
- Schema markup is missing or incorrectly implemented.
These aren't complex problems. They're the kind of thing that accumulates when sites grow without regular review.
UX issues often trace back to mobile:
- Forms have fields too small to tap accurately.
- Buttons that work with a mouse fail with a finger.
- Text requires zooming.
- Pop-ups are impossible to dismiss on a small screen.
These problems persist because the people building and testing sites typically use desktops.

Security issues frequently involve neglected maintenance. The CMS was never updated, plugins have known vulnerabilities that were never addressed, and admin URLs are exposed and guessable. HTTPS is enabled, but mixed content still triggers browser warnings. None of this is sophisticated; it's just forgotten.
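The mixed-content case shows how mechanical these checks are: on an HTTPS page, any sub-resource loaded over plain http:// triggers a browser warning or gets blocked outright. A small sketch that flags such references (links to http:// pages are fine; loaded resources are not):

```python
from html.parser import HTMLParser


class MixedContentFinder(HTMLParser):
    """Flags http:// sub-resources that would cause mixed-content warnings over HTTPS."""

    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("src", "href") and value and value.startswith("http://"):
                if tag != "a":  # plain links are allowed; loaded resources are not
                    self.insecure.append(value)


def find_mixed_content(html_doc: str) -> list:
    finder = MixedContentFinder()
    finder.feed(html_doc)
    return finder.insecure
```

Run over a crawl of an HTTPS site, a non-empty result usually means a hard-coded http:// URL survived a migration somewhere.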
An audit will surface dozens of these issues. You can't fix everything at once, and you shouldn't try. Automated tools are especially prone to overwhelming you. Screaming Frog might flag 200 "issues" on a medium-sized site, but maybe 20 of them actually matter and maybe 5 are urgent. The diagnostic skill isn't just finding problems; it's knowing which problems are actually costing you business.
The Prioritisation Problem
Impact should drive prioritisation. A slow checkout page matters more than a slow About page because one directly affects revenue and the other doesn't. A broken contact form matters more than a missing alt tag because one prevents leads and the other is a minor accessibility issue. Frame every problem in terms of business outcomes: traffic, conversions, revenue, risk. Effort matters too; some fixes take five minutes (compressing an image, adding a missing title tag, fixing a broken redirect) while others require significant development work (rebuilding page templates, refactoring JavaScript, migrating hosting infrastructure). Quick wins deserve priority when their impact is reasonable.
A useful framework: plot issues on a grid with impact on one axis and effort on the other. High impact, low effort issues are your immediate priorities. High impact, high effort ones go on the roadmap. Low impact issues get deprioritised or ignored entirely, regardless of how easy they are to fix. This sounds obvious in theory, but it's where most organisations struggle. The temptation is to fix easy things first, or to fix whatever the loudest tool flagged, or to try to achieve a perfect score on some metric that doesn't actually affect business outcomes. Disciplined prioritisation is what separates audits that produce results from audits that produce reports nobody acts on.
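The grid translates directly into code. A minimal sketch, assuming 1-5 impact and effort scores that you assign yourself during the audit:

```python
def triage(issues, threshold=3):
    """Sorts audit findings into the impact/effort quadrants described above.

    Each issue is a (name, impact, effort) tuple with scores from 1 (low) to 5 (high).
    """
    buckets = {"do now": [], "roadmap": [], "ignore": []}
    for name, impact, effort in issues:
        if impact >= threshold:
            bucket = "do now" if effort < threshold else "roadmap"
        else:
            bucket = "ignore"  # low impact: skip it, however easy the fix
        buckets[bucket].append(name)
    return buckets
```

Feeding in the examples from earlier, a broken contact form (impact 5, effort 1) lands in "do now", a slow checkout rebuild (impact 5, effort 4) goes on the roadmap, and a missing alt tag on an old blog image (impact 1, effort 1) is ignored despite being trivial to fix.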
The Methodology in Practice
Last year I ran an SEO audit for Danküchen, Austria's market-leading kitchen manufacturer. What I found wasn't unusual: dozens of small technical issues, none of them urgent on their own, all of them compounding into significant lost visibility. This is the kind of quiet accumulation that happens to every website over time. Pages get added by different people with different standards, redesigns preserve some legacy problems while introducing new ones, teams change and institutional knowledge about site structure fades. Nobody is actually doing anything wrong.
The audit followed the methodology I've described: crawl the site to identify structural and technical issues, benchmark current performance against competitors and best practices, categorize issues by type and severity, then, and this is the part that matters most, prioritise ruthlessly based on which fixes would actually move the needle. The fixes themselves weren't exotic: title tag optimisation, heading structure cleanup, meta description improvements, schema markup additions. What made the difference was knowing where to look and understanding which issues among the hundreds flagged were actually costing visibility.
After implementing the fixes, multiple keywords moved into the top 3, including "küchen österreich" to #1. SEO involves many factors, so I won't claim sole credit, but the results are clear. The takeaway isn't about any specific tactic. It's that the same diagnostic approach applies whether you're auditing for SEO, performance, UX, or security: you're looking for accumulated problems that compound into lost business, then prioritising based on what will actually deliver results.
When to Do It Yourself vs. Hire Help
You can run a basic audit yourself using the tools I've described. This makes sense if you have technical staff who can interpret results, if your site is relatively simple, if you mainly need a health check rather than deep analysis, or if budget is tight. The tools are accessible and the learning curve isn't steep. Professional help makes sense when your site is complex: multiple subdomains, hundreds of pages, custom functionality that interacts in non-obvious ways. It makes sense when you need competitive analysis and strategic recommendations, not just a list of problems. It makes sense when your team doesn't have time to learn the tools and interpret results, or when the audit will inform a major investment like a redesign or platform migration.
Common Failure Modes
Treating the audit as a one-time event is the most common mistake. Websites change. Your hosting changes. Your competitors change. Google's algorithms change7. The landscape that an audit captures is a snapshot, not a permanent record, and without periodic re-auditing or continuous monitoring, problems accumulate again. Over-relying on automated tools leads organisations to chase false positives while missing real issues. A tool might flag an image as "too large" when it's actually appropriately sized for its purpose, while the tool can't tell you that your checkout flow is confusing or that your value proposition is unclear. Human judgment remains essential.
Trying to fix everything, or fixing things in the wrong order, wastes resources on low-impact changes while urgent issues persist. Perfect Lighthouse scores don't matter if your site doesn't convert. Metrics serve business goals, not the other way around. Ignoring mobile despite knowing better happens constantly. Organisations test on desktop because that's what they use, approve designs on desktop because that's how stakeholders review them, then act surprised when mobile users (more than half of typical web traffic)8 have problems. Avoiding these mistakes requires thinking about what happens after the audit is complete.
Beyond the Audit
An audit tells you what's wrong. What happens next determines whether the audit was worth doing. You need a prioritised action list with issues ranked by impact and effort, owners assigned, and deadlines set. You need technical resources to implement fixes, whether that's your existing developer, your agency, or a specialist. You need monitoring to verify fixes worked and catch new issues before they compound. You need follow-up after 30-60 days to measure results and adjust course.
Sometimes an audit reveals that incremental fixes aren't enough. If the underlying technology is outdated, if the architecture is fundamentally broken, if technical debt has accumulated past the point of reasonable remediation, the efficient path forward might be starting fresh rather than patching indefinitely. Sometimes the audit shows problems that aren't about the website itself. If your team is drowning in manual processes, managing data in spreadsheets that should be in proper systems, the fix is building internal tools that actually support your workflow.
And if you recognize the pattern I described at the start (accumulated small problems, knowing something's wrong but feeling paralyzed about where to start) you're not alone. That paralysis is common. The audit is the first step out of it. Then you prioritise ruthlessly. Then you fix one thing at a time.
Suspect your website has accumulated issues but not sure where to start? An audit surfaces what's actually costing you and what's fine to leave alone. Let's talk about what a review of your site would involve.
Footnotes
1. A 2016 study showing that 53% of mobile users abandon sites taking over 3 seconds to load, with each additional second costing roughly 7% in conversions. Google Mobile Page Speed Study
2. Explanation of how search engines discover pages (crawlability) and decide whether to include them in search results (indexability). Crawlability and Indexability Explained
3. Ten general principles for interaction design, widely used as a framework for evaluating user interfaces. Jakob Nielsen's 10 Usability Heuristics
4. Introduction to SQL injection attacks, where malicious code is inserted into database queries through user input. SQL Injection Basics
5. Overview of cross-site scripting (XSS), where attackers inject malicious scripts into web pages viewed by other users. Cross-Site Scripting Basics
6. Average image size on tested pages was around 1MB, with total page weights around 2MB. Page weight
7. Unfortunately without warnings.
8. As of late 2024, mobile devices generate approximately 60-63% of global website traffic, having surpassed desktop in 2016. Regional variation is significant: North America and Germany still see roughly 45-50% desktop traffic, while regions like Africa and Asia exceed 70% mobile. Statista Global Mobile Traffic Share