Search engine optimization (SEO) has always been a moving target for search marketing professionals. Search algorithms change frequently, and adapting to those changes remains key to optimizing websites for better rankings and visibility in organic search results.
Google’s latest algorithm updates have shifted the SEO conversation toward making it easier for search engines to discover and crawl websites, and toward helping crawlers interpret web content. This, in turn, helps search algorithms more accurately match web pages to the increasingly complex and targeted queries we speak and type into our mobile, desktop, and smart devices. We commonly refer to these methodologies as “Technical” SEO because they focus on strategic, code-level improvements to your meta content, page performance, structured data, device compatibility, and digital accessibility.
While Technical SEO leverages many new and emerging trends in web design and development, such as progressive web applications and schema markup, its origins trace back to the history and evolution of search engine optimization over the years since Google became the world’s dominant search engine.
A Brief History of Search Engine Optimization
“Technical SEO” might be a popular term these days, but all of SEO began as a purely technical discipline in the early days of the web, when “webmasters” tried to manipulate search engine algorithms by tinkering with code and implementing…well, let’s call them questionable strategies, such as keyword stuffing or cloaking. It was the Wild West era of SEO, when finding clever ways to “trick” search engine crawler bots could win more keyword queries and improve your website’s ranking on search engine results pages (“SERPs”) with no penalties.
Of course, Google eventually got wise to these kinds of dubious “black hat SEO” tactics and began issuing regular updates to its search algorithms that negated the effects of black hat SEO strategies, forcing digital marketers to shift their approach and drive organic search traffic to websites using effective content marketing strategies. These mostly involved “link building,” or publishing high-quality content with strategically placed keywords to earn social shares and links from other websites. Growing external link networks would pass domain authority back to the content publisher, thereby increasing their website’s rank for targeted keyword search queries.
Naturally, web marketers still found plenty of nefarious ways to “get around” Google’s earliest algorithm updates, such as link schemes that might involve selling backlinks or publishing fake websites that merely served as a collection of backlinks. But Google’s algorithm kept iterating, slowly but surely learning to recognize and mostly ignore black hat SEO tactics with each new update, striving to ultimately reward websites for offering an optimal experience to more visitors.
Now SEO has come full circle, returning to best practices established in the old days, when developers coded web pages with one goal in mind: create the best user experience.
What is Technical SEO?
Technical SEO enhances your content, design, and code structure to help search engines discover, crawl, and index your website pages. It also improves your organic search rankings by optimizing your page performance and user experience to serve website visitors browsing across multiple devices.
We admit that fully understanding Technical SEO requires more than one comprehensive blog post, even if you already have deep knowledge of web development and coding. Nevertheless, we will cover four major best practices here, each with distinct strategies you can implement to build a Technical SEO foundation for your own website(s):
1. Optimized Content
You’d be mistaken to think technical SEO is unrelated to content marketing SEO. To the contrary, technical SEO supports and enhances your content marketing strategy by making the content you publish more discoverable and crawlable for search engines (which also makes it better for all of your website visitors), so that your published pages have a better chance of being indexed and displayed within search results.
With this in mind, you should still begin the technical optimization of your website by creating high-quality content, because that lays the foundation for effective technical SEO. Double-checking the following best practices during your creative process will help you technically optimize your content before it is published:
Keyword-Rich Metadata
Content is typically written according to keyword strategies designed to help it rank for targeted search queries. With that in mind, make sure to also apply your keyword strategy to meta descriptions, page titles, and header tags. Your website CMS should offer an easy way to manage and edit the metadata on each page.
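As a minimal sketch, keyword-aligned metadata in a page’s HTML might look like the following (the page names and copy here are hypothetical):

```html
<head>
  <!-- Page title and meta description aligned with the target keyword "technical SEO" -->
  <title>Technical SEO Basics: A Beginner's Guide | Example Co.</title>
  <meta name="description"
        content="Learn how technical SEO helps search engines discover, crawl, and index your website.">
</head>
<body>
  <!-- Header tags establish the keyword hierarchy that crawlers read -->
  <h1>What Is Technical SEO?</h1>
  <h2>How Search Engines Crawl Your Site</h2>
</body>
```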
Content strategists have long known that keywords in the body text of web pages help search engines. Applying the same keyword strategy to metadata helps web crawler bots understand the verbal hierarchy established within your content, providing more context so that algorithms can more confidently recognize and select your website as a suitable answer to search queries that include similar contextual keywords.
Keep in mind, keyword strategy itself is evolving as the way we search changes with the growing adoption and popularity of voice search. Why? Because voice searches on mobile and connected devices, via smart assistants like Siri, Alexa, and Google Assistant, produce fundamentally different search queries. This makes more sense when you consider the difference between typing keyword searches into Google and asking questions about that keyword more naturally in spoken conversation.
For example, digital marketers with questions about keywords might simply type “keyword strategy” into Google’s search bar and then scan the results page; whereas if they were speaking to an SEO professional, they’d probably phrase their questions as “How do keywords affect my SEO?” or “How do I build a keyword strategy for my business?” Clearly, the wording used in each query is quite different, even though each question seeks essentially the same information.
This discrepancy is especially important if your content’s goal is to rank specifically for spoken keyword queries, because unlike traditional SEO where your content competes for real estate on a SERP, for voice SEO the only search result that matters is the one spoken back to you. So to increase voice search traffic to your website, make sure the verbiage used in your keyword strategy matches the way people commonly speak the queries you’re targeting.
Related Read: Emerging Trends in Voice Search SEO
Link Building and Enhancing
While content strategy traditionally pursues the acquisition of external links from authoritative domains, technical SEO is more concerned with building internal link equity within your site.
That means your content should include links pointing back to other published pages on your website, because it makes those pages more authoritative in the eyes of crawler bots. For each internal link, it’s also important to apply your keyword strategy to anchor text (i.e. the clickable text that links to another page).
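For example, an internal link with keyword-oriented anchor text (the URL and copy here are hypothetical) looks like this:

```html
<!-- Descriptive, keyword-oriented anchor text instead of generic "click here" -->
<p>Structured data helps crawlers interpret your pages; see our
  <a href="/blog/schema-markup-guide">guide to schema markup</a> for details.</p>
```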
Of course, any linking strategy becomes worthless if the links don’t work. That’s why you should regularly perform internal link audits to identify and remove broken links and bad redirects that lead to error pages or redirect chains. Search engine bots often penalize (or worse, ignore) pages that have too many of these link errors.
Using Descriptive + Responsive Images
As always, any visual media placed within your content should be high-quality, meaning it has high resolution, good composition, appropriate context, and (ideally) original ownership or creator consent/licensing. But beyond how your images look, there are tweaks you can make to optimize them for better technical SEO:
- Filenames: First, before you even upload your graphic or photo, think about what the file should be named. Instead of uploading it with a default filename like “IMG-0251.jpg,” rename the file to something keyword-oriented, using hyphens instead of spaces (e.g. “technical-seo-infographic.jpg”), to give it a better chance of being featured in search result snippets (more on these below) for that keyword.
- Meta Copy: Once your media file is uploaded with a keyword-oriented filename, add descriptive alt text and captions that also align with your keyword strategy. For any images that are linked, make sure those links also align with your internal link strategy.
- Compression: Whenever possible, you should also compress your image and video files for faster page loading–but make sure those compressed files maintain high resolution when rendered for different screen sizes.
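Putting those three tips together, an optimized image tag might look like this sketch (filenames and dimensions are hypothetical):

```html
<!-- Keyword-oriented filename, descriptive alt text, and a responsive srcset
     so smaller, compressed variants are served to smaller screens -->
<img src="/images/technical-seo-infographic-1200.jpg"
     srcset="/images/technical-seo-infographic-600.jpg 600w,
             /images/technical-seo-infographic-1200.jpg 1200w"
     sizes="(max-width: 600px) 100vw, 1200px"
     alt="Infographic summarizing the four pillars of technical SEO">
```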
2. Structured Data
While Google’s bots are smart enough to crawl websites and make their own deductions about content based on the format, copy, and keywords used, including structured data on websites helps search engines interpret the meaning of content based on standardized formatting rules, called schemas.
What is Schema Markup?
Schema establishes a data hierarchy that all websites can follow when marking up content on pages. It is described by Schema.org as a “collaborative effort founded by Google, Microsoft, Yahoo, and Yandex to create, maintain, and promote schemas for structured data.”
Cooking recipes are a good example of web pages that benefit from schema markup, because Recipe pages typically include data such as ingredients, cooking time and temperature, and measurements. If it were a product page instead, schema properties might include the serial number, model, price, and editorial or customer reviews.
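As a sketch, a recipe page might embed Schema.org structured data as JSON-LD in its HTML (the recipe details here are made up):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Classic Banana Bread",
  "cookTime": "PT1H",
  "recipeIngredient": [
    "3 ripe bananas",
    "2 cups flour",
    "1 cup sugar"
  ]
}
</script>
```

Search engines read this block without rendering it, so it can describe the page’s content unambiguously alongside the human-readable recipe.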
Why are schema and structured data more important than ever for SEO? Because today’s SERPs are much different than the early days of search engines, when Google simply presented you with a list of organically discovered links that matched your search query. Today, websites listed in SERPs are competing with Google’s growing library of Featured Snippets.
What are Featured Snippets?
Featured Snippets consist of excerpted media such as copy, images, icons, reviews, graphics, and other content pulled from websites in order to be featured more prominently on SERPs among organic and paid search results.
The essential goal of any SEO strategy used to be “get on the first page of organic search results.” That has changed with Featured Snippets, because now even the top search result is not as visible (and thus not as valuable) as the content Google pushes into snippets, which is more likely to catch a user’s eye and draw web traffic away from the other results.
What helps Google determine the elements within a website’s content that should be pushed to featured snippets? How are its algorithms confident the content being pushed into snippets matches the search queries submitted? That’s right: Structured data.
Of course, not every website has structured data, so on those pages Google’s algorithm tries to deduce the meaning of content on its own. Google has also spent years building its Knowledge Graph (which itself draws on schema), so in many cases it can offer its own data to feature in snippets.
Still, it stands to reason that websites using structured data to help Google find and interpret their content faster will at least be rewarded in more organic search results, even if they’re not always pushed into featured snippets.
How to Test Websites for Structured Data
Want to check your website pages to see whether Google can pull rich media snippets from them based on the structured data you’ve implemented?
First, submit any website URL to Google’s free Rich Results Test and find out what (if any) structured data on your page is currently supported by Google’s rich results. If the tool doesn’t find any supported schema, you can browse Google’s search gallery of structured data types to get ideas for which schema types your pages might be able to use.
If you’re not worried about rich results just yet, and you merely want to test your own schema markup code, you can use Google’s Structured Data Testing Tool and either submit the URL of your live web page, or submit the code snippet itself to validate it.
Of course, if you want to save time and let professionals handle all of this coding for you, contact a digital agency and work with their technical SEO experts to correctly implement structured data that optimizes the content on your web pages for targeted search queries and keywords.
3. Site Performance
There are many technical aspects of a website that affect its performance, but a few simple housekeeping tasks will help you maintain a clean, updated, and high-performing website that’s friendly to web crawlers:
Site Map Maintenance
Because your site map tells search engines what to crawl, it’s important to check it regularly to make sure there are no errors or redirects, especially when you have added or removed content from your website, or you are migrating content to a new site.
Of course, you should also make sure Google knows about your site map, either by submitting it to Google Search Console or by adding it to your robots.txt file (which should be done regardless).
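For example, a robots.txt file that points crawlers to your site map can be as simple as this sketch (the domain is hypothetical):

```
# robots.txt served at https://www.example.com/robots.txt
User-agent: *
Allow: /

# Tell crawlers where to find the XML site map
Sitemap: https://www.example.com/sitemap.xml
```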
As a general rule of thumb, your site map should include only the canonical URLs you want indexed; trimming duplicate or low-value pages makes it easier for search engine bots to crawl and understand each page.
Related Read: Site Maps: Your Top Questions Answered
Page Load Speed
While it’s usually developers who are tasked with making technical modifications that improve your website’s page load speed, even non-developers can still get things started using free tools to establish a benchmark and diagnose any page load issues that are slowing down your website.
For example, under the Behavior tab of your Google Analytics dashboard, click Site Speed and then Speed Suggestions to see Google’s internal-audit-based recommendations for improving the page load speed of your website.
Mobile First Indexing
A growing majority of today’s users browse and search the web on mobile devices, so Google announced that, starting July 1, 2019, all new websites will be indexed mobile-first, based on the mobile version of their content. Why would Google choose to do this?
Simply put, Google wants to be sure mobile users will have the same experience browsing your website, with access to the same content whether they browse using desktop or mobile devices. Google engineers often refer to this as “parity of content,” and you can achieve it by checking (and where necessary, remediating) the following for each of your web pages:
- Verify the mobile and desktop versions of your content have matching text, images, videos, meta descriptions, page titles, and links.
- Add structured data markup, typically using Schema.org.
- Make sure your metadata (page titles, keywords, and descriptions) is relevant, concise, and strategic.
- Use responsive web design on all pages.
- Use matching URLs for each page’s desktop and mobile versions.
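A couple of the checklist items above begin in a page’s head element; as a minimal sketch (the titles and copy are hypothetical):

```html
<head>
  <!-- Responsive web design starts with the viewport meta tag -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <!-- Identical titles and descriptions on mobile and desktop preserve parity of content -->
  <title>Example Co. | Services</title>
  <meta name="description" content="What we do and how to reach us.">
</head>
```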
Progressive Web Applications
Progressive Web Apps (PWAs) apply coding and design standards to give your mobile websites the look and feel of a native app on multiple devices. They serve your technical SEO goals because unlike native apps, PWAs are essentially indexable websites, which makes the content within them visible (and thus, crawlable and rankable) to search engines.
Many digital marketers predict PWAs will eventually replace traditional websites as the “digital front door” for organizations, mainly because web browsing already happens more on mobile devices than on desktops or laptops. Additionally, the development costs of Progressive Web Apps are typically much lower than those of native apps, and because PWAs function on all platforms, separate versions are not needed for different devices to render content properly.
Accelerated Mobile Pages (AMP)
Following this idea that the future of the internet is web-based content and functionality accessed through numerous connected devices with varying screen sizes and processing power, Google introduced Accelerated Mobile Pages (AMP). The AMP standard helps developers code websites that perform better, load faster, and render consistent, responsive content across multiple devices.
Even if you don’t have the skill set of a seasoned web developer, you can render and publish live AMP versions of your website pages using free resources offered through Google and others, including plugins for WordPress and other website content management systems.
Once you’ve created AMP versions of your pages, you can go to Google Search Console and verify they are not causing any errors or warnings.
If GSC identifies any AMP issues with your website, your web agency can remediate them and use GSC to submit those pages back to Google for re-indexing.
4. Digital Accessibility
Digital accessibility involves the practice of making digital platforms, including websites and apps, more accessible for users with disabilities or impairments.
There are 61 million Americans living with a disability, and they often browse the web using assistive technologies like screen readers. Still, an alarming number of popular websites do not meet basic web accessibility standards, which opens those organizations to potential ADA lawsuits, because the U.S. Department of Justice has taken the position that websites must be accessible to comply with the ADA.
RELATED READ: Is Your Website ADA Compliant? If Not, You Should Worry
But making websites accessible does more than remove the liability risk of noncompliance. Web accessibility also improves technical SEO, for a lot of the same reasons it improves user experience. Here are a few ways to leverage web accessibility for better Technical SEO:
Accessible Navigation
Search engine crawlers view and navigate your website much the way a screen reader does: by parsing your code and text rather than the rendered visual design. That’s why improving site navigation for website visitors who use assistive technology also improves SEO, helping crawler bots navigate and index each of your webpages.
The best way to approach streamlining your website’s navigation is to start by creating content hierarchies, either by hand or using software. Your objective is to break down all your website content into broad categories, then make even narrower subcategories that website visitors can easily navigate to accomplish a set of common tasks.
Once your content hierarchy makes sense, compare it to your site’s current navigation hierarchies. If there are discrepancies, try to reorganize the hierarchy of pages on your website to match your newly designed navigation (your website CMS typically offers a simple way to organize page and menu structure).
You should also consider the benefit of offering navigation breadcrumbs on your site. Breadcrumbs are not just a best practice for optimal user experience, but also a type of structured data that improves SEO by helping Google categorize the content of your page in search results.
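As a sketch, breadcrumbs can be marked up as Schema.org BreadcrumbList structured data (the URLs and page names here are hypothetical):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "https://www.example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Blog",
      "item": "https://www.example.com/blog/" },
    { "@type": "ListItem", "position": 3, "name": "Technical SEO" }
  ]
}
</script>
```

Each ListItem mirrors one level of the visible breadcrumb trail, so the markup and the on-page navigation should always match.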
Accessible Meta Content
Descriptive meta content is useful not only for pages, but also for individual pieces of content. Users who browse your website with assistive technology will appreciate finding accurate and helpful descriptions for your images, videos, and links, and crawler bots will favor your website accordingly.
Better meta content not only helps crawler bots and users interpret and consume media, but also further increases the likelihood your website will rank for targeted keywords. Additionally, this will optimize your content for the future of the web, preparing your site for when Google eventually pulls even more rich media to feature in SERP snippets.
Accessible Transcripts and Captions
When featuring audio and videos on your website, including descriptive captions and transcripts improves your user experience while also offering more keywords that help crawlers choose and place your website content on more SERPs.
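In HTML, captions are attached to a video with a track element; a minimal sketch (the filenames are hypothetical):

```html
<!-- The <track> element attaches WebVTT captions to the video;
     pairing it with a text transcript on the page serves users and crawlers alike -->
<video controls src="/media/technical-seo-webinar.mp4">
  <track kind="captions" src="/media/technical-seo-webinar.en.vtt"
         srclang="en" label="English">
</video>
```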
While video captions were originally designed to help those with hearing impairments, today captions serve a much wider audience of users when you consider modern culture and how many of us regularly watch videos on our desktop or mobile devices with the volume turned low or muted. It’s yet another example of digital accessibility helping everyone, not just those with disabilities.
Related Read: Accessibility and SEO: A Perfect Fit
What is the Future of Technical SEO?
Beyond making your website more helpful and visible to both search engines and web users, Technical SEO will also better prepare your organization for changes and digital innovations in the future that will affect your strategic approach to acquiring more web traffic through organic search.
In fact, moving forward it’s more likely that rankings and featured content on SERPs will increasingly reflect what users do on websites more so than anything we do to optimize them for technical SEO. That’s because Google’s search results are increasingly factoring in contextual user data that’s only become available to them in recent years, including the device and web browser being used, the physical location from which queries are sent, and even the other content stored on devices sending queries.
Still, following established best practices for Technical SEO now will help you deliver the most optimized user experience possible today, making your content more accessible to a wider audience of web users and more discoverable to them (and bots) via organic search. Once more people are finding and visiting your website, your technically optimized content will help more of them convert and generate leads that help grow your business. And this can improve your rankings even more!
How can Technical SEO improve organic search traffic for your website?
At DBS Interactive, we’re leveraging technical SEO, web accessibility, and other digital marketing strategies to help our clients grow their business. Need some help optimizing your website to attract more organic traffic and convert more website visitors? Contact our SEO experts and let’s figure out your next steps.