My last few blog posts covered a topic near and dear to my content marketer’s heart: on-page SEO and SEO meta attributes. Now I’d like to introduce you to the technical SEO basics. Get ready for an introduction to the “under the hood” stuff that significantly affects the performance of your website — and by extension, its search rankings.
Technical SEO: A whole other language
With the right tools at hand (WordPress plus the Yoast plug-in is my choice and recommendation), a writer in partnership with an SEO strategist — or a writer well-grounded in SEO principles — can generate content that’s well optimized on-page for search.
On the other hand, technical SEO is by and large the domain of your website developer or dev team. They should be fluent in fine-tuning your site’s code for optimal technical performance. Unless you’re an HTML ninja, they should be the only people touching most of this work. Talk to them (or us) if you have any questions or concerns about your website’s technical SEO.
Technical SEO basics: The essential website attributes you should be optimizing
As you’ll read, each of these basics does its part to help your website put its best foot forward for search engines.
Technical SEO basic #1: Searchability
Call me Captain Obvious, but Google, Bing, et al. need to be able to find and crawl your website if you expect them to rank it. Take these actions to help search engines do their work to your benefit.
Be thoughtful about your website’s structure.
The easier it is for you (or other human beings) to understand the structure of your website, the easier it will also be for search engines. Aspects of an optimized site structure include:
- A clear hierarchy of pages, with well-defined content categories and relevant supporting sub-pages. (On our website you’ll find main category pages — Digital Marketing and Digital Workplace, each supported by subpages covering more specific topics.) If you find you’re creating an unwieldy amount of content for a particular category, audit that content to see if any of it is worthy of a new, relevant category.
- Internal and external links, and URLs that are named to reflect and follow that hierarchy, streamlining the robots’ journey through your site. (See the hypothetical URL sketch after this list.)
- No dead links. They’re as frustrating as hitting a dead end when you’re driving. On unfamiliar roads. And you’re running late.
- Balanced content across topics. This pertains to your basic website structure, but especially to your blog. (On the StitchDX blog, we strive to maintain a 50/50 balance of posts covering Digital Marketing and Digital Workplace topics.)
- Current content, with no outdated pages or duplicate content. Conduct regular content audits to see where updates, deletions, and redirects are called for.
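To make that hierarchy concrete, here’s a hypothetical sketch of URL paths that mirror a clear site structure. The paths are invented for illustration:

```text
example.com/                                  (home page)
example.com/digital-marketing/                (main category page)
example.com/digital-marketing/seo-basics/     (supporting sub-page)
example.com/digital-workplace/                (main category page)
example.com/digital-workplace/intranet-tips/  (supporting sub-page)
```

A robot (or a human) can read any of these URLs and immediately understand where the page sits in the hierarchy.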
Welcome the robots (but steer them where you want them to go).
Search engine robots (or “spiders” or “crawlers”) enter your website intending to scan and rank every page. However, your developers can add directives to your pages that steer where the robots go and what they do, helping you get the search results you want:
With the robots.txt file, you can tell search engines which pages you do and don’t want them to crawl (a minimal example follows this list). Why would you want to steer robots away from any of your pages?
- If your site is quite large (e-commerce sites, for instance), search engines will allocate a “crawl budget”: a limited amount of time they’ll spend with you. You and your developers can strategize the use of robots.txt to steer robots toward your hottest, most competitive products and away from, say, your “Contact Us” page.
- You and your dev team may also be trying to fine-tune (or overhaul) some key pages on your site. The robots.txt file can keep crawlers off those pages till they’re ready for prime time.
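Here’s a hypothetical robots.txt sketch to show what those directives look like. The paths are invented, and your dev team should write (and test) the real thing:

```text
# robots.txt: a hypothetical sketch, not a copy-and-paste solution.
# These rules apply to all crawlers.
User-agent: *

# Preserve crawl budget: keep robots out of low-priority or unfinished pages.
Disallow: /contact-us/
Disallow: /page-under-redesign/

# Point crawlers to the XML sitemap (see basic #6 below).
Sitemap: https://www.example.com/sitemap.xml
```

Everything not disallowed stays crawlable by default.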
Then there’s the robots meta tag, which more precisely circumscribes the actions crawlers can take at the page level (see the sketch after this list). For instance:
- You can allow robots to crawl a page but not list it on SERPs (Search Engine Results Pages). This is the “noindex” directive, and it’s a snap to set up in the Yoast WordPress plugin.
- You can also instruct (“nofollow”) the robots to ignore all the links on particular pages, or just specific links. Again, Yoast makes it easy at the page level.
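For illustration, here’s what those directives look like in a page’s HTML head. In practice, Yoast writes this tag for you based on the page-level settings you choose:

```html
<head>
  <!-- Hypothetical example: let robots crawl this page, but don't list it
       on SERPs (noindex) and don't follow any of its links (nofollow). -->
  <meta name="robots" content="noindex, nofollow">
</head>
```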
But… WARNING! DANGER, WILL ROBINSON!!!
This isn’t amateur hour: robots.txt and the robots meta tag are strictly the domain of back-end web development experts. Attempting to DIY this work can have catastrophic consequences for your website.
If you see opportunity in these coding options, partner with your development team (Don’t have one? Let’s talk.) to weigh the potential benefits and create a holistic strategy.
Technical SEO basic #2: Mobile first
Google’s robots prioritize the mobile version of your website. Therefore your web design and development team must ensure that your presence on phones and tablets optimizes all technical SEO attributes to the max. This holds true even if the majority of your site visits come via desktops or laptops.
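One baseline worth confirming with your team is the viewport meta tag, which tells phones and tablets to render your pages at device width instead of as shrunken desktop layouts:

```html
<!-- The standard responsive viewport tag. Without it, mobile browsers
     render a zoomed-out desktop page, which undermines mobile-first SEO. -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```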
Technical SEO basic #3: Site speed
People don’t stay long on websites that load at a snail’s pace. Google will penalize your site in rankings if it delivers such a suboptimal experience. Google has actually quantified “page experience” with its Core Web Vitals. These break down into three performance areas:
- Largest Contentful Paint (LCP): How long it takes for a page’s main content to load. We don’t want to wait for the information we’re seeking.
- First Input Delay (FID): How quickly a page responds to a user’s first interaction, such as clicking a link. We click, we want the next action now.
- Cumulative Layout Shift (CLS): The visual stability of a page while it loads. We’ve all tried to click a link, only to have the page shift and send us somewhere we didn’t intend.
Your site’s performance in these criteria will affect its search performance.
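You don’t have to guess at these numbers: Google’s free PageSpeed Insights tool reports Core Web Vitals for any URL you give it. For the technically curious, here’s a simplified sketch that logs a page’s LCP in the browser console using the standard PerformanceObserver API (an illustration, not a substitute for real measurement tools):

```html
<script>
  // Simplified sketch: log Largest Contentful Paint candidates as the
  // browser reports them. The last candidate logged before any user
  // input is effectively the page's LCP.
  new PerformanceObserver((entryList) => {
    for (const entry of entryList.getEntries()) {
      console.log('LCP candidate (ms):', entry.startTime);
    }
  }).observe({ type: 'largest-contentful-paint', buffered: true });
</script>
```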
Technical SEO basic #4: Security
In short: HTTPS or nothing. The search engines pay attention and rank secure sites higher.
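Beyond installing an SSL/TLS certificate, your dev team will typically force all traffic onto HTTPS at the server level. As a sketch, assuming an Apache server with mod_rewrite enabled (other servers and hosts have their own equivalents):

```apache
# Hypothetical .htaccess snippet: permanently (301) redirect every HTTP
# request to its HTTPS equivalent. Your host may handle this differently.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```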
Technical SEO basic #5: Structured data
The robots directives I discussed above give crawlers behaviors to follow. Structured data takes this idea to a higher level.
Simply put, structured data tells search engines, “Hey! Look at this!” It’s specialized code or metadata that spotlights and amplifies specific content on a page, giving you a competitive edge when it comes to prompting click-throughs. (More clicks can boost your SERP placements, which can lead to still more clicks, which can… you get the picture.)
You see structured data at work with practically every search you do, in the form of “rich results” that immediately draw your attention. Rich results can include images, a video, ratings, local-business particulars, customer testimonials, how-tos, FAQs, and on and on.
Schema.org is the authoritative source for you and your dev team to take advantage of structured data’s many possibilities. The Yoast WordPress plugin has begun to integrate some Schema.org coding, making structured data more accessible to the masses. I’m looking forward to seeing how this capability expands, and applying it to the customer sites I work with.
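To give you a feel for it, here’s a hypothetical JSON-LD snippet (Google’s preferred structured-data format) that marks up a local business with a customer rating. The name and numbers are invented for illustration:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Marketing Agency",
  "url": "https://www.example.com/",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "reviewCount": "27"
  }
}
</script>
```

With markup like this in place, search engines can display that 4.8-star rating right on the results page, which is exactly the kind of eye-catching rich result that earns the click.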
Technical SEO basic #6: XML sitemap
Even a galaxy-dominating search engine with brigades of mighty robots won’t turn down help finding its way to and around your website. And while submitting an XML sitemap to Google and Bing isn’t guaranteed to get your site crawled, doing so does improve the likelihood.
In the simplest terms, an XML sitemap is a list of all the pages of your site that you want search engines to crawl. So what?
For example: In the structure of the StitchDX website, this page about our WordPress development services sits two levels below our home page. Without a sitemap, search engines have to crawl our home page and a page in between in order to reach and crawl our WordPress dev page. The deeper down in site architecture a page sits, the more work you’re making the robots do. The search engine intuits that humans will have to do more work, too, and may penalize that page for suboptimal UX.
We submitted our XML sitemap to Google Search Console, so its crawlers can access the WordPress page (and every other page on our site) directly. For a site with a deep, multi-level architecture, the advantages are clear.
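For reference, an XML sitemap is plainer than it sounds. Here’s a minimal, hypothetical example; most CMSs (including WordPress with Yoast) generate this file automatically:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2021-06-01</lastmod>
  </url>
  <url>
    <!-- A page two levels deep gets listed exactly like the home page,
         so crawlers can reach it directly. -->
    <loc>https://www.example.com/services/wordpress-development/</loc>
    <lastmod>2021-05-15</lastmod>
  </url>
</urlset>
```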
There’s so much more to technical SEO…
Each of the six essentials I’ve outlined above is multi-faceted and multi-layered. That’s good — you can never have enough ways to tweak your technical SEO for optimal search results.
But it can also be a lot for busy SMB leaders to digest. Contact us to set up a no-obligation consultation about making technical SEO (and all aspects of your website) work better for you.
Learn how to do better in search:
Blog post: Google Ads vs. SEO: Which Is Better For Search?
Blog post: 4 SEO Keyword Research Strategies for Better SERP Rankings
Blog post: The 6 SEO Basics Your Website is Missing (and How to Get Them Right)
Blog post: Website Authority: What it Is, Why it Matters, How to Get it
Blog post: Page Authority: A Quick Primer for Your Website
Blog post: Link Building: An Introduction for SMBs
EBook: 10 Critical Questions You Should Ask About Your Website
Wondering how your website stacks up? Start with a FREE Audit.
In a few quick minutes you can learn your website’s strengths and areas to improve. It’s the natural starting point for your strategic SEO planning.
Ask these 10 critical questions.
Get our ebook, 10 Critical Questions You Should Ask About Your Website for additional perspective on assessing your site’s performance — and strategizing for better results.
We’re here to help.
Site structure, content strategy, design/development — the StitchDX team has a proven record of building websites that drive business results. Reach out anytime or click the orange chat button at bottom right; we’re here to learn about your challenges and introduce potential solutions.