Guide For Increasing Search Engine Optimization


July 2019

The Consumer Education Website Guide series aims to help state and territory staff develop effective, accessible, family-friendly consumer education websites. This series is designed to support the efforts of states and territories as they enhance their consumer education websites to help families understand the full range of child care options and resources available to them. These guides share best practices and tips that state and territory staff can use to improve the user experience, make all information clear, and prepare for common accessibility barriers, such as limited English proficiency, limited literacy skills, and disability. They will help to ensure that all families have easy access to accurate, understandable information as child care consumers.

Overview

This guide introduces search engine optimization (SEO) to Child Care and Development Fund (CCDF) Administrators and other child care professionals who want to ensure that their state and territory consumer education websites effectively serve their communities. It presents tools and resources you can use to make sure search engines can easily locate content that is relevant to users' needs; however, no one tool or resource is recommended or endorsed. Your state or territory can decide which options work best with its capacity and budget.

The information presented here will help you understand and share the importance of SEO with information technology (IT) professionals and web or digital marketing vendors. It also provides information about best practices and technical approaches that content managers and developers can use to improve consumer education websites. Examples and a glossary illustrate how key terms and data points fit within the larger context of digital strategy, marketing efforts, and organizational goals.

Search Engine Optimization

SEO increases the likelihood that target users will find web content through search engines. SEO works by affecting where content ranks in search engine results pages (SERPs). Most users visit only content that is linked in the top few results on the first SERP. For most websites, the majority of visits come from organic search referrals (in other words, the user runs a search and follows the link directly to the site), most of which come from Google's search engine. Therefore, SEO is almost always the most effective way to increase web traffic. Beyond ranking higher in SERPs, many SEO improvements ultimately result in a better website experience for users.

SEO is an ongoing process. While it will take some time to see results, there is potential for long-term rewards. The process can also be flexible, meaning some SEO improvements can be made to the website over time, and others can be made later if budget, time, or access to development resources is limited.

There are two main types of SEO:

- Technical SEO: What happens on the state and territory's consumer education website to increase search rankings.
- Content SEO: What happens on other sites to increase rankings on the consumer education website, primarily through link building and social media marketing.

The most important element of SEO is having useful, usable, and relevant content, but this is largely part of internal content marketing and development efforts. Technical SEO (information about your website, populated in your source code and on your page) makes sure that metadata are built in ways that increase the likelihood that a search engine's ranking algorithm will rank one site's content higher than others. Technical SEO also supports technical improvements like page load speed and mobile friendliness.

Metadata generally include the following:

- Page title
- Meta description
- Alt text

Figure 1. Google SERP. Source: Google. (n.d.). Top search results for "best child care providers near me" [Search engine]. Google Search. Google and the Google logo are registered trademarks of Google LLC, used with permission.

Search engines crawl websites using a robot, an automated program that looks at all aspects of your website, including what is on the page and the metadata associated with that page. They then use algorithms to process that content and decide where it ranks on an SERP. While not all aspects of search engines' algorithms are public knowledge, much is known about how they treat metadata and what kinds of changes have a positive effect on rankings.

SEO work focuses on answering and then adapting content and metadata to respond to the question: What are users trying to do or learn online, and how can we help them find more content that enables those actions?

SEO Audit Process

To answer what users' top tasks or questions are, look to website and search analytics for clues. For example, high-traffic entry pages (the first page visited on the website) and high-ranking referring search terms will indicate what users want most. These areas are low-hanging fruit for increasing traffic from organic searches, especially if they are already performing well without SEO improvements.

Google Analytics, which is free, is one of the more popular website analytics tools. Google Search Console, which is a must-have for search analytics, is a free service that helps monitor and maintain your website's presence in Google Search results. It allows you to check indexing status (how well Google can view and understand your website) and optimize the visibility of your site.

User research, conducted through interviews and surveys, is also useful in learning more about users' content needs. Web content should adapt to fit those needs, and the metadata used to describe that content should accurately portray value to the user.
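To make the earlier metadata discussion concrete, here is a sketch of where page titles, meta descriptions, and alt text live in a page's HTML source. The page name, file names, and wording are hypothetical illustrations, not examples taken from this guide:

```html
<!-- Hypothetical example of the page-level metadata read by search engine robots -->
<head>
  <!-- Page title: typically shown as the large link text on an SERP -->
  <title>Search for Child Care in Your State</title>

  <!-- Meta description: the blurb shown under the title on an SERP -->
  <meta name="description"
        content="Find licensed child care providers near you and compare quality ratings.">
</head>
<body>
  <!-- Alt text: describes an image for robots and for users who cannot see it -->
  <img src="provider-search.png" alt="Map of licensed child care providers">
</body>
```

An SEO audit checks that fields like these are present, accurate, and within recommended lengths on each important page.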

Below are the general guidelines for SEO workflow when auditing and improving technical SEO.

1. Conduct analytics and user research to determine where to start improving SEO (in other words, examine the content that appears to be most valued by an audience or search terms that are driving the most traffic to your website). Determine the pages you want to improve and note their current performance so that you can measure the impact of your efforts later.
2. Define the scope of SEO improvements based on the research. Don't worry about improving everything initially; first focus on the most valuable content.
3. Run a crawl of the site to inventory all URLs and their metadata. Free and low-cost tools can help you capture this information.
4. Sort the URLs by response codes to identify broken links (for example, 404 errors) or other errors (for example, 500 server error, 302 temporary redirect), which have a negative impact on SEO and user experience; pass your findings on to a web team to fix the errors.
5. Sort the remaining URLs based on the previously completed research (in other words, sort by traffic or other user-research-informed priorities).
6. Audit metadata using the content manager best practices shared in this guide for each metadata type. If you have empty fields for a specific type of metadata on an important page for your website, prioritize that as your next step.
7. Write and upload new metadata where the audit revealed they are missing or do not meet best practices guidelines.
8. Conduct a series of other scans using SEO tools to consider other technical limitations or issues with your webpages.
9. Track organic search referral traffic to changed pages, so you can see if there has been an uptick compared to previous performance. Results may take one to three months to take effect as pages are crawled. After six months, page-specific tracking of technical SEO impact likely is not needed to show impact, but you can make this decision based on when organic referral numbers begin to plateau.

Other SEO-Focused Processes

As part of the SEO audit, consider one or more of the following. Many of these items are explained in more detail later in the document.

- Run the site through Barracuda's free Panguin tool to determine if site traffic was impacted by any algorithm changes. This works only if the site uses Google Analytics and may be most useful if traffic has recently declined. For example, in the summer of 2018, Google implemented a page-speed update to their algorithm, meaning pages that loaded quicker could get higher rankings.
- If access to multiple tools is available, consider comparing crawl results (for example, 404 errors) across different tools to ensure nothing has been missed by another tool.
- Use duplicate content checkers to ensure that there isn't repeated information on the organization's own site or other websites, which Google will sometimes filter out from search results.
- Run a name, address, and phone number analysis, if applicable. This means ensuring that a site has the correct contact information on it and that other sites also include matching information. The more accurate and consistent these web listings, the better for SEO. This may include getting listed on Yellow Pages, Yelp, Google, Bing, Yahoo, Facebook, and other sites that provide users with contact or geographic information.
- Conduct a competition analysis of competitor sites that currently rank on the first SERP for terms in which the state or territory is attempting to rank. Search Engine Journal provides a helpful guide for this type of analysis. Many SEO-focused software tools provide automated capabilities for obtaining data used in this type of analysis. Some tools will even provide a free spreadsheet template for running the analysis.

- Ensure that a sitemap for the website has been submitted to Google Search Console and Bing Webmaster Tools. A sitemap is a list of all the pages on the website. It helps search engines quickly crawl (go through) and index (make sense of the organization of) the site, meaning that it delivers more meaningful results. Further information on sitemaps is found later in this document.
- Ensure that a robots.txt file has been submitted to Google Search Console and Bing Webmaster Tools; robots.txt files restrict search engines from showing particular pages on the site in search results. You can find further information on robots.txt later in this document.
- Check the speed of key pages on the website (including your homepage) and note any technical recommendations. Google uses speed as one of its ranking factors, so it is important that the site loads in a reasonable amount of time. The following tools are free and open source:
  - PageSpeed: strive for a score above 85.
  - Lighthouse is also good for accessibility and checking against best practices.
- Check the mobile friendliness of key pages on the website (including your homepage) and note technical recommendations.
- Perform the above audit and other SEO-focused processes at least every six months. If page content has not changed, the metadata likely will not need to be changed for previously improved pages. If SEO best practices are incorporated into content manager workflows, any new content published since the last audit and improvements should already be optimized.

Content Manager Best Practices

Search engines are unique because they provide targeted traffic (people looking for specific content). Search engines are the roadways that make this happen. If search engines cannot find a site or add content to their databases, the site misses out on incredible opportunities to drive traffic.
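For reference, minimal versions of the sitemap and robots.txt files described in the audit checklist above might look like the following. The domain and paths are hypothetical illustrations, not taken from this guide:

```text
# robots.txt - tells crawlers which paths not to index
User-agent: *
Disallow: /search-results/
Sitemap: https://www.example.gov/sitemap.xml
```

```xml
<!-- sitemap.xml - lists the pages you want search engines to crawl and index -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.gov/find-child-care</loc>
    <lastmod>2019-07-01</lastmod>
  </url>
</urlset>
```

Both files normally sit at the root of the site, and both can be submitted through Google Search Console and Bing Webmaster Tools as described above.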
Many methods exist for improving SEO, and using several in combination will produce better results. These best practices cover how to write meta titles, descriptions, and headings for each page to improve how robots interpret and rank page content.

Content Best Practices Overview

Content creators should generally do the following to improve SEO:

- Write titles, links, and headings that are accurate and descriptive.
- For top-level or high-traffic pages, write clear descriptions that summarize the page's content. Don't rely on a search engine's ability to pull this information automatically.
- Always write for people, not search engines. The times of keyword stuffing are over, and trying to "game the system" with buzzword-laden metadata can harm a site's performance in a search.
- Use brief, clear URLs.

Figure 2. Google Chrome Browser. The example in the figure is the first result that appears when searching for "Google Chrome." The large blue text is the page's title, the grey text is the meta description, and the small blue links are related pages within the site's information architecture or sitemap. Source: Google. (n.d.). Single result on a SERP displaying title, URL, description, and related links [Search engine results]. Google Search. Google and the Google logo are registered trademarks of Google LLC, used with permission.

Page Titles

Keyword stuffing will negatively impact SEO. However, keywords should be used in titles. Consider this example:

- Old: Figure Out Day Care with This Tool Filled with Helpful Data—Our Guide for Licensed Providers and Ratings
- New: Search for Day Care in Delaware

Beyond keyword stuffing, the old title is also too long. Some guidelines for writing useful titles include the following:

- Keep titles not only under 55 characters but also under five words, if possible.
- Place keywords toward the beginning of the title.
- Don't simplify the title so much that it creates confusion with other content on the site.

Meta Descriptions

The description (sometimes called a meta description or page summary) is a blurb that explains to search engine users what they can expect to find on a page. Sometimes, if a page lacks a description, the opening lines of the body content may appear. This often results in descriptions that are cut off, too long, or repetitive, with too many other descriptions on the site generated in the same way. However, there is the possibility that no text will appear at all, leaving users and robots to judge a page solely by a title and a URL. When writing page descriptions, follow these guidelines:

- Write in plain language; as always, write for users, not algorithms.
- Build on the title tag (which specifies the title of a webpage), making sure both make sense together.
- Be concise and keep the character count under 155 characters (including spaces).
- Try to place the primary word or phrase toward the beginning of the description.

Heading Levels

Although they are not often pulled into SERPs, headings are an important factor of SEO, usability, and readability. Breaking up content with headings improves users' ability to scan the content; they don't have to read a large block of text to find the part of a page they need. Search engines also "scan" headings by looking for heading tags (for example, H1, H2) to determine what content is on each page.

- Use H1 at the top of the page to display the title.
- Afterwards, use sequential heading levels (H2, then, if necessary, H3).
- Do not use too many heading levels (for example, a page with minimal text will not need many heading levels).
- Do not use heading levels to format the appearance of text (for example, in place of bolding or other styles that should not be headings).

Note: Using headings properly is also important for accessibility. Assistive technologies like screen readers rely on heading levels to allow users to quickly navigate the page by reading headings in order without having to fully read every paragraph first. For more information on making your website accessible, see the guide in this series on designing websites that are accessible to all families.
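The heading-level rules above can be sketched in HTML. The page topic and heading text here are hypothetical examples, not content from this guide:

```html
<!-- Hypothetical page outline using sequential heading levels -->
<h1>Search for Day Care in Delaware</h1>

<h2>How to Use the Provider Search</h2>
<p>Body text for this section...</p>

<h2>Understanding Quality Ratings</h2>
<h3>What the Star Levels Mean</h3>
<p>Body text for this subsection...</p>
```

Note that there is one H1 at the top, H2s divide the main sections, and an H3 appears only beneath an H2; no level is skipped, and none is used purely for visual styling.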

Editing Metadata

Often, a content management system (CMS) uses a plugin or module to handle metadata. This module typically appears somewhere when editing a page's record; the user interface and methods will vary widely depending on the site's technical specifications and CMS implementation. The web development team should be able to help point content managers to where they can edit metadata. By default, many SEO modules use tokens to determine how to generate metadata and create automated metadata based on text on the page. The automated metadata may be more useful than no metadata at all, but it often generates lengthy meta descriptions that break mid-sentence because of character limitations, or duplicate descriptions across many pages. Beware of using automatically generated metadata.

Examples of Revised Metadata

The examples below show metadata that have been rewritten for improved SEO.

Example 1. The page title was shortened, and the page content was split up to be more usable.

- Old title: How Can I File a Complaint Against a Child Care Center or Provider or Learn If a Complaint Has Been Filed?
- New title: How Can I File a Complaint Against a Child Care Provider?

Example 2. Meta descriptions are not page content. They should describe the page's content, not attempt to deliver it.

- Old meta description: The Child Care and Development Fund reauthorization includes requirements for health and safety training. This resource provides information on health and safety professional training for out-of-school time providers, as required by state, plus access to research-based online training modules. Trainings are free; it costs $5 for a professional development certification of completion. This site offers modules on other topics too, such as positive guidance and preparing to teach STEM (Science, Technology, Engineering, and Math).
- New meta description: Information on health and safety professional training for out-of-school time providers.

Example 3. The old heading displayed the full website's name instead of what was on the page.

- Old H1: CCTAN—The Child Care Technical Assistance Network, a Service of the Office of Child Care
- New H1: Data Explorer and State Profiles

Developer Best Practices

The following are IT-related changes that web development teams may implement to improve SEO.

URL Structure

URL structure refers to the way addresses for your pages are formatted. Well-formed addresses are often called "clean" or "semantic" URLs. These parameters can often be automated through a CMS to avoid inconsistencies or confusion, but the content or web team should regularly audit URLs to ensure that any automated URLs follow these best practices:

- Use whole words; avoid strings of numbers, symbols, or parameters (for example, avoid URLs ending in something like sp?category=lnl&id=0315).
- Limit keyword use and put terms that best describe the page at the beginning.
- URLs should be short to avoid SERP cutoffs and to make copying and pasting easy.
- Use hyphens to separate words (not underscores or spaces).
- For top-level domain names, users expect to see a URL that does not include special characters; otherwise, it seems spam-related. The top-level domain should match the purpose of the website (for example, .com for businesses, .gov for governments). These are more important for user experience and marketing purposes than SEO.

Mobile Optimization

Figure 3. Colorado Shines Collapsible Menu and Responsive Content. Source: Colorado Shines. (n.d.). For families [Webpage].

Mobile optimization is critical for modern websites as more users access the web on devices other than their computers. Therefore, it's unsurprising that in 2015, Google began giving preference to mobile-friendly websites. Take Google's mobile-friendly test to see a breakdown of whether a website is optimized for mobile devices.

- Do not have unplayable content that will be locked out on mobile devices (for example, a video that causes an error message such as, "This video is not available on mobile.").
- Do not have pop-up or overlay content that blocks the content or the user's ability to complete actions on the page (such as a lightboxed "Want to read more? Sign up for our newsletter!" in which the "x" to close the box is inaccessible for mobile users).

Another problem, which faces only unresponsive websites, is when mobile users are all redirected to the mobile homepage, regardless of the page they're trying to access. Generally, it is better to optimize a full website for mobile devices than to use a separate site.

Because mobile devices are not as powerful as full computers, they access web content more slowly and sometimes on slower Internet speeds, which is part of why load time is so important not only to SEO but also to mobile optimization.
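As a general illustration not drawn from this guide, responsive pages commonly pair a viewport meta tag with CSS media queries so one site adapts to all screen sizes (the approach the guide recommends over a separate mobile site). The class name and breakpoint below are hypothetical:

```html
<!-- Hypothetical responsive-page fragment -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  /* Two columns on wide screens... */
  .content { display: flex; }

  /* ...collapsing to a single column on narrow screens */
  @media (max-width: 600px) {
    .content { display: block; }
  }
</style>
```

Because the same URLs serve every device, this approach also avoids the mobile-homepage redirect problem described above.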

Load Time

Load time is the speed at which an individual page loads for visitors. Faster is better. Pages with seemingly insignificant delays can see substantial increases in bounce rate, negatively affected conversions, and lower user engagement.

The accelerated mobile pages (AMP) movement is driving more content to be designed to load faster on mobile devices. AMP is a project from Google and Twitter designed to make mobile pages that are lightweight and fast loading. Content publishers should consider implementing and testing AMP pages. At the very least, sites should aim for a 90–100 rating for both mobile and desktop devices when using Google's PageSpeed Insights test for maximum optimization.

Load time is relevant to SEO because Google and other search engines factor speed into their rankings. Search engines don't want their users to be delayed, which could lead to frustration. Load time is also important for allowing robots to crawl pages easily. In other words, because search engines need to index as many sites as possible as quickly as possible, preference is given to sites that help them achieve that goal. To address this, follow these guidelines:

- Optimize code and reduce the number of redirects, because search engines penalize you for redirects, as they are extra steps that get in the way of your destination.
- Optimize images by not using files that are unnecessarily large; even if the image is set to display at a certain size or ratio, the browser must take time to load the full file.
- Design and develop pages as mobile-first, considering AMP versions where applicable.
- Avoid duplicate content pages.

The last point regarding duplicate content pages is particularly important for increasing return visits and improving pages-per-session metrics. A browser can cache elements of a page, making it load faster at a later time. If a user visits the same content from two different URLs, they can't take advantage of this functionality. Their browser will have to take more time to load two separate pages, regardless of whether those pages have identical content.

Canonical URL

A canonical tag (also known as rel="canonical") is a small piece of code used to tell search engines that a webpage is the true version of that page. If you have web content that is duplicate or very similar to other pages, search engines can sometimes get confused about which page to deliver in their search results. You place the rel="canonical" tag on the page you want search engines to deliver in their search results. It looks like this in a site's source code: <link rel="canonical" href="YOUR URL GOES HERE" />.

Conclusion

SEO can help states and territories improve and increase the number of visitors to their consumer education websites. The tips and techniques in this guide help states and territories understand the importance of SEO and show them how to improve SEO for their consumer education websites.

You may use the methods in the guide as a starting point for state child care officials to better understand and communicate SEO with staff or contractors who help write web content. This guide should also be shared with IT departments to guide them in selecting the best SEO options and to help ensure that families, child care providers, and other stakeholders can find their state or territory's consumer education website.

Glossary

The following are common terms used when discussing digital analytics and optimization for the web.

- Alt text: Metadata that is readable by a robot and describes a digital asset if the intended asset (such as an image) does not load or the user cannot see it (for example, due to impaired vision).
- Broken link: A hyperlink that leads to an error page where the intended content no longer exists (often a 404 error code page).
- CMS: Stands for content management system and is the platform that facilitates front-end expression of content to users on a website.
- Crawl: When a robot gathers information from the web by indexing and reviewing content. These robots are sometimes called spiders, and they are used by digital professionals and search engines. Digital professionals crawl sites to find the information that other robots from search engines would see. The goal is to find opportunities to improve what spiders see and index so that content is easier to find.
- Heading, heading tag, or header tag: Terms used to describe code that tells browsers which styles to use in displaying the content used in headings, which are text elements used to divide sections of content on a page. Spiders also use these tags to understand the structure of a webpage. Examples of heading tags are H1 or H2. Content managers should never use WYSIWYG (defined below) formatting such as bolding or increasing text size to create headings on webpages; instead, the site should have set heading styles, and the appropriate heading tag should be used at each level of page content.
- HTTPS: An abbreviation for "hypertext transfer protocol secure," referring to the process by which data are sent between a browser and the website it is viewing. HTTPS, unlike HTTP, is encrypted and is not only the modern standard for the web (users may abandon sites that are not secure) but also should be enabled to improve search rankings (HTTPS has factored into Google's algorithm since 2014).
- Keyword: A type of page-level metadata that is no longer recommended for use. However, including keywords not as metadata, but within relevant content, is beneficial to SEO. Search engines may punish sites that abuse keywords by giving them a lower ranking.
- Link building: The process of getting other sites to link back to a site.
- Load time: The amount of time it takes for a page to load, from the time the user clicks or enters the link until the completion of the load in the browser (typically measured in seconds).
- Meta description: A type of metadata that describes to a robot what the page content covers. It can also be used in search engine results pages to preview a page's content to users.
- Metadata: Content that is coded into a web asset (such as a webpage or image) that is readable by robots and used in crawling, indexing, and ranking. Examples of metadata include meta descriptions, heading tags, and alt text.
- Organic search and organic search referral: When a user conducts a search through an engine such as Google or Bing and is given a link in the results that they follow to the site. Organic refers to the fact that these occurrences are unpaid and do not result from pay-per-click search ads.
- Page title: This refers not only to the title of the page itself but also to the metadata used to indicate to spiders what the title of the page is; typically, the page title metadata and the title of the content on the page are, and should be, the same.
- Redirect: Method used to change where a user lands after entering an initial page.
- Robots.txt: This text file on a website tells robots how to treat the site when crawling it. For example, if there are pages that should not be indexed, the robots.txt file would tell spiders not to follow certain pages.

- Search engine: A web tool used to conduct searches in the form of keywords or a phrase, sometimes input by the user as a question. The most popular search engines are Google (by far), Microsoft Bing, and Yahoo.
- Search engine optimization: SEO is the process of improving organic search performance by editing metadata (called technical SEO) and adding valuable content that increases page rankings in search engine results. There are other more technical factors that also affect SEO, such as page load speeds.
- Search engine results page: The SERP is the page or pages that load after conducting a search with a search engine. Most users click links on the first SERP.
- URL: Acronym for uniform resource locator, which is simply the address of the page.
- WYSIWYG: "What you see is what you get." This term usually refers to content management systems that allow users to edit content in formatting windows with "ribbons" that are commonly used in word processing software. WYSIWYG editing is a convenient option for making content changes. It allows the user to see what the content looks like as opposed to seeing a code version that a developer would see.

State Capacity Building Center, A Service of the Office of Child Care
Address: 9300 Lee Highway, Fairfax, VA 22031
Phone: 877-296-2401
Email: [email protected]
Subscribe to Updates

The State Capacity Building Center (SCBC) works with state and territory leaders and th
