Guide to Web Analytics – an introduction

This material on web analytics was left over when I wrote the book Web Strategy for Everyone, now available from the publisher Intranätverk ›

This is an introduction for those of you who do not yet know whether web analytics is for you.

Does your website have any visitors? Do you care about what the users do and whether they manage to achieve something beneficial during their visit? If you do not work at all with web analytics or measurability, one may wonder what the point is of publishing anything at all. Working actively with web analytics is your chance to know whether your website-related work and investments are meaningful – take that chance!

Glossary on web analytics

  • Web analytics – Methods to plan ahead and follow up on how a website is performing. May be of a technical nature, but usually it is about measurable factors of achievement.
  • Website statistics – Gathering of information about what users do on the website. Common tools for website statistics are Google Analytics, Adobe Analytics and Piwik, the open source option.
  • Visit – When a user visits the site. A visitor can make several visits and generate several page views.
  • Visitor – A unique visitor, to the extent it’s possible to track individuals.
  • Page view – An impression of a page. The number of times a page is displayed during a visit.
  • Pageviews / visit – Average number of pages viewed during a visit, i.e. during a user’s session on a website.
  • Bounce rate – The percentage of visits with only one page view, after which the visitor leaves the site.
  • New visitors – An indication of how many of the visitors have not previously visited the site.
  • Direct traffic – Visitors who type the address directly into the browser, enter via a bookmark, or similar.
  • Referring sites – The websites that link to your website.
  • Inbound link – A specific link on another website that points to a page on your website.
  • Organic search traffic – The visitors you attract thanks to your presence on search engines. Does not include bought traffic on search engines, such as through advertisements.
  • Page load time – The time in seconds it takes for a visitor to download all the material needed for a single page view.
  • Click-through rate (CTR) – The percentage of users who choose to click on a link or button they have on display.
  • Web Performance Optimization (WPO) – The ambition to make a website be perceived as fast by its users.
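
The relationships between these basic metrics can be sketched in a few lines of Python, using made-up numbers purely for illustration:

```python
# Toy session data: each entry is the number of page views in one visit.
# All numbers are hypothetical.
visits = [1, 4, 2, 1, 6, 1, 3]

page_views = sum(visits)                    # total page views
pages_per_visit = page_views / len(visits)  # average pages per visit
bounces = sum(1 for v in visits if v == 1)  # visits with a single page view
bounce_rate = bounces / len(visits)         # share of one-page visits

print(page_views)                 # 18
print(round(pages_per_visit, 2))  # 2.57
print(round(bounce_rate, 2))      # 0.43
```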

Keep in mind that what appears as several unique visitors may well be the same person, visiting your website via a tablet, mobile, laptop and a desktop work computer. A unique visitor does not necessarily mean more than a unique cookie in a browser – and you can actually have several browsers per device. The same skepticism is also useful when you look at other data in your website statistics. Errors occur, and there is no reason to accept anything without reflecting on what you see.

How to gather the statistics?

The activity of collecting information about visits on a website is called logging, sometimes also known as tracking. Some common variations on tracking are:

  1. A piece of code is inserted on all pages of a website. For each page viewed, the visit is reported to a statistics service. In Google Analytics this code even has a name, ‘the GATC’ (Google Analytics Tracking Code).
  2. A (most commonly invisible) image is loaded on each page view. The image is loaded directly from a statistics system that in this way listens in on what a visitor does on the website.
  3. Examination of the web server’s log files. Many websites log every single page view, each image sent, etc., and the information can be valuable in an assembled form for fundamental web analytics.
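
As a sketch of the third variant, here is a minimal Python example that assembles page views from fabricated access-log lines (the addresses are made up, but the layout mirrors the common Apache/Nginx log format):

```python
import re
from collections import Counter

# A few fabricated lines in a common access-log format.
log_lines = [
    '10.0.0.1 - - [14/Jul/2016:20:20:08 +0000] "GET /index.html HTTP/1.1" 200 5120',
    '10.0.0.2 - - [14/Jul/2016:20:21:10 +0000] "GET /about.html HTTP/1.1" 200 2048',
    '10.0.0.1 - - [14/Jul/2016:20:22:30 +0000] "GET /index.html HTTP/1.1" 200 5120',
    '10.0.0.3 - - [14/Jul/2016:20:23:01 +0000] "GET /logo.png HTTP/1.1" 200 9000',
]

request_re = re.compile(r'"GET (\S+) HTTP')
hits = Counter()
for line in log_lines:
    m = request_re.search(line)
    # Count only page views, not images and other assets.
    if m and not m.group(1).endswith('.png'):
        hits[m.group(1)] += 1

print(hits.most_common())  # [('/index.html', 2), ('/about.html', 1)]
```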

The point of analyzing a website

You might ask yourself: what is meaningful to measure? Besides quantitative visitor trends, what is worth measuring is often the achievement of operational objectives – measuring to see what is changing, and which improvements increase the conversion rate of visitors accomplishing the goals the business has with the website. Put extremely simply, the aim is to eliminate the barriers between your users and the goals you have with the site.

Eliminating the problems that keep your visitors from converting

If you invest time and money in having a website, it is well worth the effort to try to remove obstacles the visitors experience. Of course, it is not exactly trivial to optimize a website for every possible use case, but by working with web analytics you gain knowledge of what is worth improving.

Common usability problems include for instance:

  • Complicated checkout processes, from the shopping cart to actually reaching the payment phase. Most often, a majority of users disappear at the same place in a multi-step process. Such a step has potential for improvement.
  • Parts of the site are inaccessible. Some still use forms, JavaScript and other obstacles on the way to the information on a website. A search engine does not usually cope with such barriers, which makes it difficult for a visitor to go directly to the target through a search engine results page (SERP).
  • Bad information structure, i.e. it is difficult to find what you are looking for. Evaluate which information is popular on your website and perhaps make adjustments so that it is simple to reach the highest-priority content.

Being aware of your audience and improving your digital voice

There are several ways to get to know your visitors – among other things, by drawing conclusions about the keywords they use when searching. They do not necessarily use the terms you prefer to use.

They may prefer to navigate with menus, go straight for the search feature, or just click on links with an adjacent image.

By analyzing visitor behavior on a website, you can understand how your audience interprets the approach of the website. To evaluate what changes are bearing fruit you can employ something called A/B split testing.

A/B split testing

An A/B test involves defining a goal for the visitor’s session. It can be to fill in an email address and subscribe to a newsletter. To evaluate which design proposal is most successful, you have two different versions of the sign-up page, and which version is shown to each user is randomly selected.

The version where most users enroll is crowned the winner. A/B tests also provide valuable feedback for future changes to a website.
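
A rough sketch, in Python, of how the two variants could be compared – including a crude two-proportion z-test of the kind A/B tools run behind the scenes. The visitor counts are invented:

```python
import math

# Hypothetical results: (visitors shown, subscriptions) per variant.
variant_a = (1000, 52)
variant_b = (1000, 71)

rate_a = variant_a[1] / variant_a[0]  # conversion rate of version A
rate_b = variant_b[1] / variant_b[0]  # conversion rate of version B
winner = "A" if rate_a > rate_b else "B"

# Two-proportion z-test: is the difference larger than chance would explain?
p = (variant_a[1] + variant_b[1]) / (variant_a[0] + variant_b[0])
se = math.sqrt(p * (1 - p) * (1 / variant_a[0] + 1 / variant_b[0]))
z = (rate_b - rate_a) / se

print(winner)       # B
print(round(z, 2))  # 1.77 -- below 1.96, so not yet significant at the 95% level
```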

Preparing for campaigns and other potential high traffic peaks

Google Page Speed Insights for Boston Globe
One way to prepare yourself and your website for unusually many visitors is to evaluate your performance. Google PageSpeed is a tool to get started with; here it is reviewing The Boston Globe.

Not all websites can handle the onslaught that a successful campaign can cause. In theory, most of us probably think that we have the capacity to cope with what we can imagine.

The question is whether the media, counties and authorities were prepared for the thirst for information on September 11, 2001, for instance? Or the tsunami at Christmas 2004? Or students applying for education through the Web on the last day of registration? Or all of us declaring our taxes via the Web just before midnight on the last day?

Most of us have probably tried to visit an established organization’s website while it has been down. By actively working with web analytics we can focus on efficient service – sending the necessary information at as low a cost as possible.

Examples of common mistakes in web performance:

  • Not optimizing published pictures, video and other media files for the Web. This can lead to a traffic jam that brings down your website.
  • Failing to set a life expectancy on files, also known as time-to-live (TTL). Without a TTL, every single file is sent again even if the visitor already has the files in the browser’s cache. Simply put, your system design is inefficient regarding bandwidth.
  • Sending information in an uncompressed format. Not rarely, when we fumble with GZip compression, a page becomes up to 20 times as heavy to download. Under normal circumstances you are not likely to notice this – especially if you are on a high-speed connection at work. See the picture above, from The Boston Globe, for an example of a more or less temporary mishap regarding Web Performance Optimization (WPO).
  • Popular pages with technical complexity behind the scenes that therefore load slowly. If you swiftly assemble a campaign that drives traffic to a page requiring a lot of web server performance, you are risking that it will stop working when the campaign is at its peak.
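
The compression point is easy to demonstrate with Python’s standard gzip module: repetitive HTML shrinks dramatically. The page content below is a fabricated stand-in:

```python
import gzip

# A crude stand-in for an HTML page: markup is highly repetitive.
page = b"<div class='item'><p>Hello, reader!</p></div>\n" * 200

compressed = gzip.compress(page)

print(len(page))                     # 9200 bytes uncompressed
print(len(page) // len(compressed))  # roughly how many times smaller it got
```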

Now we will switch focus a bit: a bunch of questions for all of us working with the content of a website. To get to know your website and its users, you have some homework to do. However, these metrics are not worth reporting upwards in the organization – they are for you and others who actively work with the website’s content.

Where are visitors coming from?

Acquisition through Google Analytics
Where are visitors coming from? In Google Analytics, you can see how users ended up on your website.

A public website often acquires 75–95% of its traffic through search engines. If the share is below 75%, it may be that the site has managed to gain valuable inbound links from important sites. Otherwise, the website can probably improve from a search engine point of view – read the introduction to search engine optimization, for instance.
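
To check where your own site falls, the share is a one-line calculation; a sketch with invented channel numbers:

```python
# Hypothetical monthly visit counts per acquisition channel.
visits_by_source = {
    "organic search": 8200,
    "referring sites": 1100,
    "direct": 700,
}

total = sum(visits_by_source.values())
search_share = visits_by_source["organic search"] / total

print(round(search_share * 100))       # 82
print(75 <= search_share * 100 <= 95)  # True -- within the typical range
```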

Bounce rate per traffic source

Displaying the list of referral traffic according to evasive users is one of the most interesting views. Sort the bounce column in descending order: which source of inbound links is most repulsive to its users?

Some websites that link to yours may no longer link to the right destination; perhaps some users are redirected to the home page. Or is the information being sought not easy enough to find?

Investigate the extreme cases of bounce rates – those which attract a lot of visitors – and think about why the statistics are the way they are. Such as in the image above: why do 77% of mobile Facebook users bounce?

Additional questions:

  • Are you missing any website which ought to be sending some visitors? Maybe some related organization should be asked to place a link to your website.

What keywords are used to find your website?

Keyword analytics: Keywords used on search engines
In recent years, Google has become increasingly secretive about which keywords are used. But they remain, in another form, in Google Search Console.

In addition to learning which keywords perform well, it is important to think about which words are missing. The words may be absent for several reasons: they are not used, other sites outperform yours in the search engines, or the search engine is not capable of handling the language you are using (if it is not English, this is unfortunately quite common).

Resources:

  • Google Keyword Planner can be useful to see which keywords are actually used, what competition there is, and to compare popularity between different words.
  • Google Trends shows a historical trend between different words. However, Google Trends is not perfect for those of us mainly communicating in languages other than English.

What keywords are used on the website’s search?

Google Analytics with Site search analytics
Many web statistics tools provide at least basic support for Site Search Analytics – analyzing a site’s own search function. Google Analytics certainly does.

In addition to knowing what is popularly sought after, it is relevant to at least sample the searches that can be considered important or frequent. It is not uncommon that concepts obvious to the visitor give poor search results. You should check the common queries and work editorially with search so that relevant content is accessible via the search function.

What content has a high bounce rate, and why?

Bounce rate per page
A high bounce rate is not necessarily bad, but it should be predictable. If the point of a certain page is to link externally, all is fine and dandy.

That a page has a high bounce rate is not necessarily because it is bad. However, pages worth improving are often found when looking for a high bounce rate. One way to take advantage of this information is in Google Analytics, under Behavior -> Site Content -> All Pages.

The default list view shows how the various parts of the website are used, sorted by most pageviews. With a little sense of what each page on the website consists of, and what it is intended to be used for, one can ask whether the numbers are as desired. If not, what needs to change?

Consider this example of news on a website:

  1. News that has very little text and a link to another website. A topic interesting to a few.
  2. News with some text, multiple headers and paragraphs, images, and about a topic that applies to virtually everyone. Links to pages on the same website.

The first example probably has a lower average time on page and a higher bounce rate – simply because there is not much content and visitors are enticed to click links to other websites.

The second should have a longer average time and a lower bounce rate, because there is more content to absorb and the links are internal to the website. In other words, there really is no right or wrong. What you should check is whether visitors actually use the website’s content as intended.

If you put great effort into comprehensive information and the webpages have relatively short visiting times, there probably is potential to do better.

Why have we got more / fewer visitors?

Apart from obvious things – that you changed the address or archived large parts of the website – the reasons are usually not that hard to find. First, look at what contributed to the change, for example whether it is because of:

  1. Search Engine traffic
  2. Referring Sites
  3. Direct Traffic

Look for changes in trends, segmented for instance on source of traffic.

Google Panda and changes of algorithms

One example of an algorithm change that affected search engine traffic to a great extent for many websites happened during the summer of 2011, when Google changed the order in which pages were listed in their search engine (the so-called Panda update). The aim was to provide more relevant results and leave out websites that were not considered to contribute something unique or of quality. One practice that got punished was copying parts of texts from Wikipedia or other established sites that Google already knew about.

A not entirely unusual phenomenon is that other websites affect your traffic to a great extent, at least sometimes. If a site with a huge number of visitors links to your website it can, in addition to bringing a lot of visitors, cause problems for the website to stay online and responsive. Among web developers this is often called slashdotting, named after a very popular website, slashdot.org, that sometimes brings other websites down if they are not ready to take care of all the visitors.

Direct traffic is usually a very small part of a website’s overall traffic. Something we have seen in recent years is that the percentage declines as people start to use Google’s browser, Chrome. That is because if you enter a URL in the address bar you sometimes end up on Google Search – hence a common trend that more and more people are searching for the address of one’s website. Moreover, people appear not to use bookmarks to as big an extent as in the past.

What external factors may have influenced the website?

Intelligence Events – automated alerts in Google Analytics
Many web analytics tools have some way to show abnormalities, such as this one in Google Analytics, called Intelligence Events.

Clues can often be quickly found in the website statistics’ automatic alerts. They assemble notifications of detected deviations from past trends, such as suddenly gaining more visitors from a particular geographical area. This feature in Google Analytics has been around for a couple of years and is getting pretty good in its precision (as in not bothering you with uninteresting details).

Do we reach out with the information we consider important?

First you need to figure out which parts are important and whether they are placed in a way that makes them easy to find. To reach out with important information, the content needs to be advertised in relevant places on the website, preferably pages that already have a lot of visitors.

A common misconception is that most visitors will enter through the home page. Even though many actually do, it is still common that a large proportion never see the home page and thus may miss information that is only highlighted there. The argument “but we put it on the home page” may not be sufficient in all situations.

Click-through Rate (CTR)

To measure a news story, for instance, during the time it has been in the news listings and on start pages, we can measure its click-through rate – checking how many people actually chose an individual news story. A low click-through rate can be caused by things other than users not having seen the link; linguistic choices might make just as big a difference. Inspect your content and you might find patterns of what works and what needs to be improved.
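
The calculation itself is simple; a sketch with invented counts for one news teaser:

```python
# Hypothetical counts for one news teaser during its time on the start page.
impressions = 5400  # times the teaser link was displayed
clicks = 162        # times a visitor clicked through to the story

ctr = clicks / impressions
print(f"{ctr:.1%}")  # 3.0%
```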

How is the website for mobile users?

How does the website perform for mobile users?
It is often meaningful to segment, so you can see how a mobile user’s usage pattern differs from those on a computer or a tablet such as an iPad.

Depending on how you take care of visitors using a mobile or tablet, the site may be perceived as more or less helpful. From the picture above it can be concluded that the site in question is not appreciated as much by mobile visitors – at least they show a behavior of fewer page views, less time on the site and a higher bounce rate.

Segmenting mobile devices
The usage pattern may also differ depending on which device a user is using. This may already be included in your web analytics tool.

As the image above illustrates, with the various mobile devices, how a website is used differs depending on the device a user has. Partly it has to do with the screen size, but also with the quality of the screen.

With a tablet it is easier to use a website that is not adapted for mobile visitors, at least compared with a smartphone. This conclusion can be drawn based on bounce rates and pages/visit when you compare iPad and iPhone. Nowadays the difference is much smaller, since most websites adhere to responsive web design.

Create custom user segments to compare the different user groups among themselves

You can use this feature to create custom segments to filter out only mobile visitors and, over time, monitor how they behave on a site. Or create multiple segments to compare those with Apple mobiles against those with Samsung.

Compare iPhone and iPad, side by side:

iPhone and iPad with a responsive website
The presentation of a website can differ greatly between different kinds of devices.

Because of the mobile phone’s small screen, a non-responsive website is like trying to read a screen from a couple of meters away. The user needs to pinch and zoom, may struggle to hit the links, and navigation is difficult without an overview. Unlike a tablet, where the experience usually works decently, but still not optimally.

Today, many websites follow the design principle of responsive web design. But still, most websites are not equally good on all kinds of devices visitors choose to use. A change in the devices chosen can make a big difference in how successful your website is.

How did our campaign perform compared to last year?

Campaign analytics in Google Analytics
Most web analytics tools support comparing a campaign with a generic time period without any campaign.

By comparing the two periods with each other, we can draw conclusions about things such as seasonal variations, gift ideas, or when the seasonal flu is likely to return. However, it is important to filter out the parts of the website or service concerned, so that we are comparing things that are equal.

For example, you can measure how many visitors were reached through search engines on pages designated for specific keywords. In my case, working with healthcare, topics such as the norovirus have been present in the page names. This can be done using the tool for segmentation in Google Analytics. Then select the start and end dates for the two periods you wish to compare.

What parts of your site are perceived as slow by visitors?

Connection speed to a website according to Google Analytics
Google has, for a while now, offered a couple of views on how fast a website is – here the view ‘Page Timings’.

One of the newer additions to Google Analytics is monitoring which pages are slow. No important webpage should take more than 2–3 seconds to load. Pages with several images take longer to load, since a single image often takes longer to download than the rest of the content.
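
A back-of-the-envelope calculation shows why images dominate the load-time budget. The asset sizes and bandwidth below are invented, and latency is ignored, so this is only a lower bound:

```python
# Rough, hypothetical asset sizes for one page view, in kilobytes.
assets_kb = {"html": 40, "css": 60, "js": 250, "images": 1400}

total_kb = sum(assets_kb.values())
bandwidth_kbit_s = 1000  # roughly a 1 Mbit/s cellular connection

# 8 bits per byte; pure transfer time, ignoring latency and parsing.
seconds = total_kb * 8 / bandwidth_kbit_s

print(total_kb)  # 1750
print(seconds)   # 14.0 -- far beyond a 2-3 second budget
```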

What do search engines think the website is all about?

Google Search Console lists which words are common on a website
Google Search Console was formerly known as Webmaster Tools. In Search Console, you can find out which keywords are commonly found on your website when Google’s bots index it.

Search engines are often described as omniscient, yet severely limited in other ways. They cannot understand the meaning of most images, figure out a video’s content, or use advanced and modern techniques on the Web.

However, search engines excel in tracking virtually all text on a website. In fact, a search engine probably knows more about a website’s content than the web editor does. Through all the words it finds, the search engine tries to understand what the website is about, whether it can be categorized, and which keywords are more relevant than others.

Google Search Console lists the words the search engine thinks are relevant to a website. Not infrequently, one is surprised by the words that are common. Sometimes there are words which ended up there through technological mistakes when developing the website. Preferably, popular keywords related to the website should be high on this list.

High on the search engine results page but few visitors?

Search analytics with Google Search Console using keywords and CTR
Google Search Console can tell you if people actually click on your pages in Google search results.

There are many possible reasons for this. Most common is probably that it is a keyword few people, if any, use. To figure that out, you can check Google Keyword Planner and search for the term to see how many searches it gets. Another reason may be that your information does not show itself from its best side on the search engine’s results page. It may be that many people see your link but choose something else. This requires you to work actively with your click-through rate – that is, to change something and follow up on whether you attracted more people who choose to click through to your website.

To measure this, you can use Google Search Console (see picture above). The CTR column indicates the percentage who click on your link in the Search Engine Results Page (SERP).

A common solution with great impact is to change the page title of the designated page – you know, the clickable text that pops up in the search engine. The title performs best if it is less than 70 characters, states the most important keywords as early as possible, and feels understandable to the person you want to attract. Check out the tips from Moz.com about how to write efficient page titles ›
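
Those title rules are mechanical enough to check automatically; a small sketch (the threshold of 30 characters for “early” is my own arbitrary choice, and the titles are invented):

```python
def check_title(title: str, keyword: str) -> list:
    """Flag common page-title problems mentioned in SEO guides."""
    problems = []
    if len(title) > 70:
        problems.append("longer than 70 characters")
    pos = title.lower().find(keyword.lower())
    if pos == -1:
        problems.append("keyword missing")
    elif pos > 30:
        problems.append("keyword appears late in the title")
    return problems

print(check_title("Web analytics - an introduction for beginners", "web analytics"))
# []
print(check_title(
    "An overly long and rambling introduction for absolute beginners "
    "to the art of web analytics", "web analytics"))
# ['longer than 70 characters', 'keyword appears late in the title']
```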

Tools

In addition to Google Analytics and other statistical tools, you can find out more about what could be better by trying additional tools. Here are some suggestions.

Try some measurement tools

As an editor, you are rarely expected to have more than basic knowledge of web technology. Luckily, there are plenty of educational and simple tools that help identify issues that can cause problems with the site you are working with.

Woorank

Woorank with The Boston Globe to figure out what improvements can be made regarding webpage quality, best practice and technical factors
Woorank suggests a few things that can be improved with a website.

Woorank provides a review of lots of quality factors for a website. If you want a single number to compare sites with, Woorank is a good option.

YSlow, Google Page Speed & WebPagetest

Webpagetest.org with The Boston Globe, reviewing Web Performance Optimization (WPO)
Webpagetest.org provides a waterfall view of the files loaded. You can also select to test on an actual cellular connection to see how the site appears to real users (approaching so-called RUM – Real User Monitoring).

Yahoo’s YSlow and Google PageSpeed check how good the performance of a webpage is. This is especially important for mobile visitors, or when preparing for a campaign.

YSlow is an add-on for the browser Firefox, but if you cannot install it you can use Google PageSpeed directly in the browser. Or WebPagetest.org, which has useful features like visual comparison of two sites’ speed, and can measure over an actual cellular network.

SEO Doctor

SEO Doctor in Firefox – finding out the level of search engine optimization of a webpage
SEO Doctor is a browser extension. It will help you get an idea of what can be improved when optimizing for SEO.

SEO Doctor is an extension for Firefox that gives a single value for how well a page is optimized for search engines. For example, it informs you if a page has a too-long page title, is missing description texts, etc.

Senseo

SenSEO in Firefox – nice for web analytics, but also for researching keyword usage on a website
SenSEO is also a browser plugin that can support you with tips on where the content is not good enough from an SEO point of view.

SenSEO is an extension for Firefox that helps out if you want to check how individual keywords perform in relation to search engines. The tool also discusses things that are difficult for an editor to influence, such as whether the address is made up of lowercase or uppercase letters.

Google Trends

Google Trends comparing kebab against hot dog – great for choosing the right wording
Google Trends can help you when trying to sort out which word is most established, or which is most sought after.

Google Trends helps you get an idea of which word, among synonyms or related concepts, people use when they search on Google. The service is not spot on for smaller languages, such as Swedish, but for the most part it works well, and for English it is excellent.

Checklist for web analytics

  • Relate website content to the business goals, and attempt to measure visitors’ fulfillment of those goals on the website.
  • When starting or stopping a campaign (or making a major design change), make a note in your web analytics tool. That makes it easier to follow up later on.
  • Try to find the information you published a while ago. How does it look? Is something in need of change?
  • Make sure media files are optimized for small file sizes. For images, you can do this with the service Smush.it.
  • Consider using synonyms of important keywords; synonyms are often useful in headlines. To see which words to default to, you can use Google Trends.

Not fed up with analytics yet? In late autumn 2016 a book, an anthology about intranets, is to be published – I have contributed a chapter on intranet analytics. Check it out ›

Also, check out the extensive blog post introducing Search Engine Optimization (SEO) for beginners. And of course, if you like my writing, please consider buying the book Web Strategy for Everyone.

SEO – an introduction to Search Engine Optimization

This material was left over when I wrote the book Web Strategy for Everyone. Rather than throwing it away, it is now published with some editing – though not quite as ambitious as it would’ve been in print 🙂

Search engine optimization is all about optimizing one’s web presence so that search engines take a liking to your site. It is often abbreviated SEO. Something quite different is keyword optimization – the editorial part of the SEO process.

Search Engine Optimization ≈ Make your website’s content great according to search engines

Search engine optimization, also known as SEO for short, is your attempt to attract visitors to your website by making your site as good as possible in relation to search engines and their users. It is possible to divide SEO into lots of niche terms, but I will restrict myself to the difference between search engine optimization and keyword optimization because, in its basic form, it is about either technology or linguistics. This post will therefore not take up the subject of search engine marketing (SEM).

Search engine optimization is basically about having great content. What, then, is good content? The following features are a good start for content you wish to attract users to:

  1. Unique – the same content is not available on other sites (such as through syndication / RSS, etc.).
  2. Suitable length – short product texts are regarded as mere facts.
  3. Appreciated by others – has links pointing to it, and many visitors (Google has access to your visitor statistics if you use Google Analytics).
  4. Posted on a website that has subject-related authority – where more information about similar things is available.
  5. From a trustworthy source – is the author an authority on what they write about?
  6. Accessible – well put, without long chunks of text, and preferably marked up with RDFa/Microdata techniques such as Schema.org, so that even machines understand what the content is about.
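
As a sketch of the machine-readable markup in the last point: Schema.org data is often embedded as JSON-LD, which can be generated with nothing but Python’s json module. The article details below are invented:

```python
import json

# A minimal Schema.org "Article" description as JSON-LD.
# All field values here are hypothetical.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Guide to Web Analytics - an introduction",
    "datePublished": "2016-07-14",
}

# This string would go inside a <script type="application/ld+json"> tag.
json_ld = json.dumps(article, indent=2)
print('"@type": "Article"' in json_ld)  # True
```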

Search Engine Land has an excellent Periodic Table of SEO, where you can see more positive factors.

There are also bad signals, among other things:

  • Many different authors with few posts each. Google, at least, does not like guest posts or suspected sponsored posts, and many writers can be an indication of suspicious behavior.
  • Outlinks to pages with low credibility, links that lead to 404 errors, or links to hacked websites.
  • Too many links. Offering hundreds of links in your content exhibits an inability to prioritize – everything cannot be a priority.
  • Slow page loads. Both search engines and users have better things to do than wait for a slow website.

Sitemaps

The structure of the website needs to be good enough for a crawler to be able to look around, but there is still a point in submitting a sitemap to the search engines. For Google, this is done through their Search Console; for Bing, through their Webmaster Tools.
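A sitemap itself is a small XML file following the sitemaps.org protocol. A minimal sketch (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want the search engines to know about -->
  <url>
    <loc>http://www.example.com/weekend-trip-to-prague/</loc>
    <lastmod>2016-07-14</lastmod>
  </url>
</urlset>
```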

SEO best practice changes over time

A common mistake among those who do not work much with SEO is to believe that it does not change very much – it does, a lot! You can tell, for instance, by the books released about SEO, which nowadays are named with their year of release, and by authors trying to delist their books a couple of years after publication. This is the reason why I did not include a chapter on SEO in my book Web Strategy for Everyone – the subject matter is too agile for print.

A common misconception is that it is beneficial to enter keywords, so-called meta keywords. This belief is still widespread, and it is just one of many examples of why it is important to keep on top of updates in SEO.

The reason: attempts to spam Google…

For example, Google is fighting a constant battle against search engine spammers attempting to lure in visitors. The problem Google is trying to solve is to display only relevant hits from the original source of whatever is sought. For instance, it is not supposed to be worthwhile to build a site that largely contains material snatched from Wikipedia – in that case, Wikipedia is what should be displayed in the results.

If you want to learn more about these updates, you can visit SEO forums, or search for the animal names Panda and Penguin, which were the first two major upheavals in SEO. While you are out there googling, you can also read about the Google algorithm Hummingbird, whose aim is to understand a search query's context and provide sensible synonyms and related searches. This is what the book Web Strategy for Everyone referred to as Web 3.0, or the Semantic Web: an attempt to make machines understand content and context, and to bridge the structural differences between data sources.

Everyone needs to know some SEO!

The reason all website owners, editors and bloggers need to learn a lot about search engine optimization is that otherwise they miss out on customers and visitors. Compared with physical reality, an optimized site is a shop placed on a street its customers frequently visit. A website that is not optimized at all is like a department store on top of Mount Everest – surely a cool concept store, but there are not many customers in the vicinity.

Keyword optimization means making your website attractive to search engines and clearly positioning the keywords your audience uses, so the website is easy to find. Apart from some initial technical measures, it is ultimately those who manage the website's content who have the greatest impact on making it keyword optimized.

Somewhat simplified, one can say that only text is searchable, because it is difficult for search engines' software to understand the content of an image or video unless it is accompanied by descriptive text. Although machines are getting better at understanding images, video and speech, it will be a while before we can skip producing text altogether.

Glossary of SEO

  • Keyword Optimization, SEO (Search Engine Optimization) – The activity and know-how of improving websites so they are easily found through a search engine.
  • Search queries & keywords – Usually one or more words that a user types into a search engine in the hope of finding something relevant.
  • Search results & SERP (Search Engine Results Page) – SERP is often used by those initiated into the search engine industry and stands for Search Engine Results Page: the page the search engine returns in response to a search query.
  • Visitors – All the visits a website receives. Much like the visitors to a physical business, some are recurring and others are not.
  • Unique visitors – A unique visitor differs from a visitor in that we can identify them as a unique individual who may perform multiple visits to a website.
  • Page views – The number of times a visitor viewed a page on the site, or the number of views for a certain page.
  • Unique page views – The number of unique visitors who viewed a page. It is not unusual for a visitor to visit a single page multiple times, which still counts as a single unique page view.
  • Index – The search engine's index is what it knows about, and what its users can search within.
  • Bot, crawler and spider – The search engine's software that scours the web for new, changed and deleted material such as pages, images and other types of documents.
  • Inbound links – The links pointing to your site, i.e. other sites linking to yours.
  • Keyword density – The percentage of a text that consists of relevant keywords.

Editorial keyword optimization

Wording and editorial

The visitors you want to reach are those who are actively looking for something your website offers. Therefore, it is very important that you are familiar with how your audience express themselves and what search queries they use.
Are they looking for ‘gore-tex hiking boots’ or ‘The North Face Hedgehog GTX 2016 edition’? While both search queries may describe the same product, someone searching for one of them will not necessarily see the other in their search results.

It is paramount to be aware of the words your audience uses, so there is no risk that the words you have chosen only rarely match what a search engine user actually types.

Usually it is a combination of several possible words that constitutes a search term. Try to naturally work in the most commonly used word on each product's sub-page. Later in this post there are useful tools, like Google's Keyword Planner, you can use to find out how many people use certain search terms.

Heading ≠ Headline

Keep in mind that there is a big difference between a heading and a headline. A heading is more of a name for something, while a headline describes what you can expect from the content.

Sample heading:
Tale of Little Red Riding Hood and the Wolf

Sample headline:
Wolf ate grandmother

Of the two examples above, only the latter is wise to use to any large extent on the Web. The heading does tell us the kind of information – a fairy tale – and who is involved – Little Red Riding Hood and someone/something called the Wolf – but it in no way summarizes what is to come. The point of instead writing that a wolf ate grandmother is to summarize everything the text has to tell. This kind of content obfuscation is all too common.

Writing for the Web means putting the most important things first: a descriptive title, a summarizing preamble when necessary, the most important points early in the body text, and only then the details and references for learning more.

Writing page titles

The page title on The New Yorker's articles is the same as the headline.

The page title is the name of the page and is often reflected in the top of the browser window (at least on computers). The page title is also the clickable text in the search engine’s result page and the name in a browser’s bookmarks.

Often, the page title is the same as you choose to name the webpage. In some web content management systems you can manually write whatever you want for the page title and some other phrase for the main header of the page.

There are two common variations of how page titles are written. What separates them is whether the site's name or the page's unique name comes first. To be unnecessarily obvious, two examples below:

  1. Weekend trip to Prague – Travel Company Inc
  2. Travel Company Inc – Weekend trip to Prague

In example 1 above, the unique information comes first, which means the words ‘weekend trip to Prague’ are more keyword optimized than the company name.

Example 2 focuses on clearly telling who owns the site – at the expense of the actual content of the individual page.

Before choosing the syntax of example 2, it is good to think through whether it is worthwhile competing to rank for your own brand. It is often the case that users search for what a business does or offers rather than for the company itself. Moreover, you usually rank really high in search engines when users search for your brand anyway. So I would absolutely recommend example 1.

In fact, some take this to such an extent that they only write the page name in the page title. After all, the URL is shown nearby on a search engine's results page, and it usually identifies the sender.

It is worth bearing in mind that a page title should, if possible, be a maximum of 70 characters. You will not get much more space on the search engine results page and, besides, it is probably not common for users to read much more than the first few words. Yes, users skim, almost all the time, on the Web.
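In the HTML code, the page title from example 1 would simply be (a sketch reusing the example page above):

```html
<head>
  <!-- Unique page content first, site name last; under 70 characters -->
  <title>Weekend trip to Prague – Travel Company Inc</title>
</head>
```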

Headlines

In order to clearly divide a body of text and achieve good readability, we as readers need information structure. For this we use headlines that describe the body text below them. It is not enough to format a header as bold, large text – headlines must have the correct HTML code to remove all doubt that they are headlines. The same applies to headings.

If possible, it is recommended that every important subpage have a headline and at least one sub-heading of level 2. These headings should come in the right order, in descending sizes, without skipping any level.
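In HTML that means using the heading elements in order. A sketch, reusing the headline example from earlier:

```html
<!-- Correct heading hierarchy: one h1, then h2 below it, no skipped levels -->
<h1>Wolf ate grandmother</h1>
<p>Introductory body text…</p>
<h2>What happened in the forest</h2>
<p>More body text…</p>
```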

If I had a dime for every time a web editor explained why they did not use the correct heading levels, or real headlines at all, it would at least have paid for a family pizza. The most common argument is that the text becomes too large, gets a different color, changes font, or is set in uppercase. That is a problem they should rather take up with their web developers.

Words in a headline are worth more than those in the subsequent body text, because they are more prominent. That is actually logical: sighted users, blind users and machines alike skim the text through its headlines, as they act as entrances to the text between headings.

Alternative text for images (and other media files)

Alternative texts describe a media file's content for those who cannot perceive it, such as the visually impaired, the blind and the deaf – and that includes search engines!

Putting alternative text on images is really for the visually impaired and for those not loading pictures. The text is meant to explain the image, which, for those not loading pictures, can indicate which images are worth loading at all.

The reason some choose not to load images in their browsers is often a dubious internet connection, the cost of data on most cellular plans, or a wish to avoid advertising.

Although alternative texts are used primarily to explain images' content for those who cannot perceive the image, you also have the opportunity to use other terms when writing these almost invisible phrases. Consider using synonyms and suitable words that are not already present in the body text.
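In the HTML code, the alternative text lives in the img element's alt attribute. A sketch, reusing the hiking-boot example from earlier (the filename and wording are made up):

```html
<!-- The alt text is read aloud by screen readers and indexed by search engines -->
<img src="hiking-boots.jpg"
     alt="Gore-Tex hiking boots on a muddy mountain trail">
```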

Body text

That a page contains more than images is important, because otherwise there is no text to search for. Many SEO professionals consider there to be a critical minimum amount of text – which is of course difficult to put a static figure on. Then there is the problem that search engines increasingly try to weed out so-called content farms. A content farm is a website whose pages contain either very thin, keyword-targeted text or content that is not unique – for example, snatched from Wikipedia.

In addition, the body is the ultimate place for a good breadth of searchable keywords. You can also put some effort into something called keyword density – using the same words several times over.

However, website copy is first and foremost for the benefit of the reader. It ought to be legible and comprehensible; pleasing the search engines is secondary. Once you have achieved legibility, you can care about SEO – such as making the body text at least 300 words, a threshold most SEO practitioners have kept raising in recent years.

Bullet points and numbered lists

Words that appear in a list on a web page are considered to be of greater value than those in ordinary paragraphs. Lists visually guide the reader through a body of text – whether the person uses a screen reader or not – and, for the simple reason that they stand out from the rest of the content, they stand out to search engines too, often containing relevant facts or summaries.

If you have text that lists things, also format the content as an ordered or unordered list. Lists make text easier to absorb, and they are a great way not to scare visitors away with a massive wall of unstructured body text.
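In HTML, the two list types look like this – ul for unordered bullet points and ol when the order matters (the items are made up for the example):

```html
<!-- Unordered list: bullet points, order does not matter -->
<ul>
  <li>Unique content</li>
  <li>Suitable length</li>
</ul>

<!-- Ordered list: numbered, order matters -->
<ol>
  <li>Write the headline</li>
  <li>Write the body text</li>
</ol>
```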

Inbound links and URLs

Having many quality inbound links, especially from known or trusted sites, is important for showing that the website's content is worth linking to.

Many tend to register their sites in different link directories, exchange links on webmaster forums or email their partners to get links. To a limited extent, it is still worthwhile to get the website listed in these contexts. But remember that the words in the clickable text of the link are associated with your site. Therefore, send a thoughtful suggestion so the anchor text does not become ‘Check their site here …’

Descriptive URLs are also relevant in SEO, if only because they sometimes become the clickable text when someone posts the URL in a forum or comment on the Web. Not to mention that it is good for anyone who stumbles across a URL to be able to figure out what it refers to.

Example of a descriptive URL, also called “friendly URL”:
webstrategyforeveryone.com/performance-for-intranets/

Example to the contrary:
www.gp.se/sport/1.565873

The addresses of your pages are worth keeping as brief as possible – but still descriptive. This is something many Web Content Management systems (WCMs) struggle with. For example, the URL syntax in Episerver – which my employer, Region Västra Götaland, uses – sometimes becomes absurdly long:

www.vgregion.se/en/Vastra-Gotalandsregionen/Home/
Healthcare/Public-Health-/Public-health-policy-Vastra-Gotaland/
Living-conditions-based-on-equity-and-equal-opportunities/
Region-Vastra-Gotaland-is-developing-a-system-to-monitor-health-
inequalities-/

Try typing that correctly when read to you over the phone…

Some addresses are so long they become incomprehensible. There is also the risk that they break when sent by e-mail or posted in forums, since many systems cut “words” that are too long and insert spaces (in order not to break their design).

Website speed – Web Performance Optimization (WPO)

Google Search Console measures the time spent downloading webpages from your website.

That a website loads swiftly has become an increasingly important factor in recent years. Above is a picture from Google's tool Search Console, one of many ways to keep track of your website's performance. In the example, it takes an average of 0.61 seconds to load a page, which is not great but not alarmingly bad either.

The speed of a website is one of the factors influencing whether it ranks high in search engines. Website speed is therefore not only a usability metric – users convert at a higher rate on fast websites – but has also been one of Google's ranking factors for a couple of years.

If the website has poor performance it is commonly one of the reasons below:

  • Slow web host. Is the site on the host's budget tier? Then you cannot expect particularly good performance – at least not for a website based on a content management system like WordPress.
  • Poorly optimized website. WordPress is among the most common systems nowadays. One problem with WordPress is how easy it is to add plugins that negatively affect performance – often each plugin adds an extra stylesheet and an extra Javascript file. Plugins are also of varying quality; evaluate whether there are better alternatives for the ones you need.
  • Too many, or too big, files. Uploading lots of high-quality pictures and lots of Javascript features affects performance significantly. A browser usually cannot download more than three files at once from your website; additional files wait in a queue, and that queue is fairly long on most websites, unfortunately.

A lot of tricks and my own experiences regarding performance optimization are available in Web Strategy for Everyone.

Common solutions to performance problems are:

  • Optimize images so they do not take as long to download. An image's file size can be optimized in Photoshop, for example with its ‘Save for Web’ feature, with applications such as ImageOptim for macOS or FileOptimizer for Windows, or with web services like Smush.it.
  • Ask the web developer to combine stylesheets and Javascript into as few files as possible. The Javascript library jQuery is used on many websites; instead of hosting the file yourself, you could load it from Google's so-called CDN (Content Delivery Network). If you have many Javascript files of your own, you can look for CDNs offering to serve your unique files as well. This is often free – perhaps you will like Cloudflare (as much as I do).
  • Change web hosting. Small hosting companies are often faster than the big ones at about the same cost – how come is a topic for another blog post, though. If you have plenty of money for your web project, consider a Virtual Private Server (VPS) or even a dedicated server. If so, this is the time to talk to web developers or involve an experienced web consultant. If you would like my advice, just post a comment on this post.
  • Evaluate the code behind the website and review any plugins. Not infrequently there is code that can be improved, and other plugins that provide better performance.
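As a sketch of the CDN advice above, loading jQuery from Google's public CDN instead of your own server can look like this (the version number is just an example):

```html
<!-- Served from Google's CDN: possibly already cached in the visitor's
     browser from another site that uses the same URL -->
<script src="https://ajax.googleapis.com/ajax/libs/jquery/3.1.0/jquery.min.js"></script>
```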

Meta description of pages

The meta description of a webpage sometimes appears in the search engine's results page, its ‘SERP’.

The description, or ‘meta description’ for the technically minded, is a text, hidden from the browsing user, that briefly summarizes what the webpage is about. The meta description should always be unique for each page of the website; otherwise there is no point in having one.

The picture above shows one of the few cases where a user can actually see the meta description: next to the page title on a search engine's results page, its SERP (Search Engine Results Page). The meta description is supposed to summarize the content of a webpage and needs to be well written to attract users to your website.

The meta description is placed in the source code between HTML tags <head> and </head>. It may look like this:

<meta name="description" content="List for those who want to
learn the latest on substance X. Here you will find advice and
tips from the professionals at X."/>

If you would like to see what your competitors use for their meta description texts, view the source code of each webpage; early in the code you will find something like the above.

Keywords (really not that important in external SEO for the last 15 years)

Keywords are not used by Google, but you can make use of them in your site's own search function.

Keywords were once important for reaching out in the search engines, and some search engines still use them today. But given the proportion of users who search with Google, and since Google does not care about keywords, it is not always worth the effort to work with keywords on your website.

The reason Google, and probably other search engines, do not care about keywords is the ease of manipulating them on a massive scale. Spamming search engines with keywords has simply led to them no longer considering them.

If you still choose to work with keywords (mainly for your own internal search engine), do not put the same general words on every page. Try to keep them unique to what each page contains.

For large organizations that have their own search engine, or an internal site search, keywords can definitely be worthwhile.

If you’re looking for the keywords in the HTML code, look between <head> and </head> like in the code below:

<meta name="keywords" content="keyword1, keyword2, etc." />

Validation

The website's underlying code is supposed to validate against the code standard it claims to adhere to. Anything else is sloppy coding. However, not all mishaps are critical to how well your site performs in search engines. Still, everything that is easy to fix (or deemed serious) is good practice to fix. If your website's code is not even close to validating, it can prove difficult to interpret for search engines and other machines you rely upon.

To find out if a website validates according to the web standard, you can go to the web standards organization W3C and enter the address you want to check – validator.w3.org

Useful tools and methods

Create a list of prioritized words and search terms

Open a spreadsheet and enter the keywords you think are relevant to your website. Also, rank them in their relative importance and how descriptive they are.

Using Google Keyword Planner (see below), you can get help and insight into which accompanying words are used and how much search volume the terms have.

Google Keyword Planner

With this tool you can enter keywords, find accompanying words, and see the search volume for each search term. Really good when you need guidance on which wordings and expressions to choose when writing web copy.

Keyword Planner is part of the Google Adwords platform, but it does not require an account or active keyword campaign to dig into the keywords. Try Google Keyword Planner ›

Plugins for your web browser

The Firefox plugin SEO Doctor helps you evaluate the search engine optimization of webpages.

There are lots of browser plugins, at least for Firefox, that help you with search engine optimization. Two of the most appreciated among novices are SEO Doctor and SenSEO.

With SEO Doctor, Firefox can review every webpage you browse – even your competitors' webpages. Clicking to open the SEO Doctor plugin shows a dialogue with details explaining the score, and you get hands-on guidance on what can be improved on the individual page, such as adding a sub-heading or a meta description.

SenSEO is an extension for Firefox. In this example it shows how well a website is tailored to the phrase ‘chocolate pastries’.

SenSEO is best used to evaluate how well a particular word is optimized on an individual webpage.

In other words, once you have found a word you want to rank better for in the search engines, SenSEO is the tool for seeing what can be changed.

Performance Measurement

My local newspaper, Göteborgs-Posten, takes 13 seconds to load according to WebPagetest.org.

There are many online services that assist in analyzing how quickly a website can present itself to users. If you are not technically inclined, it is worth talking to a web developer to get some perspective on the reports.

One of these services is WebPagetest, shown in the picture above. Others are Google PageSpeed Insights and, for the more technically minded, Sitespeed.io.

Validation

W3C is the organization that writes the recommendations for many web technologies. They offer a service for checking how well a website adheres to the standards it claims to follow.

W3C has a good validation service to minimize the amount of incorrect code. My local newspaper, Göteborgs-Posten, is not receiving top marks …

Above is how well GP.se adheres to the code standard it has chosen – unfortunately, not that flattering. Check out your own website with the W3C ›

Checklist for editorial SEO

If you make an SEO checklist for your web editors, or write copy for your website yourself, check out the suggestions below.

Below is a checklist, in order of priority, of where effect can be achieved through editorial SEO.

  1. The page title contains relevant keywords and is shorter than 70 characters.
  2. The main header (H1) contains important (and possibly unique/new) keywords, or supplemental synonyms the website needs.
  3. Subheaders (H2) contain relevant or supplemental keywords.
  4. Thoughtful keywords are included in the anchor text of links on other websites that link to yours.
  5. If there are images, they have suitable alternative texts.
  6. Content suitable for lists is placed in lists (UL/OL).
  7. Important keywords appear one or more times in the body text, preferably early on – ideally with some synonyms later in the text.

More on Search Engine Optimization

Also, check out the book Web Strategy for Everyone ›

Web Strategy for Everyone released today (Tue, 17 May 2016) http://webstrategyforeveryone.com/web-strategy-everyone-released-today/

Yes, today is a great day, for several reasons. Not only is it two years since the Swedish original edition was released, but today the English book is released too – Web Strategy for Everyone. The icing on the cake: May 17 is also neighboring Norway's national day, which makes the date quite easy to remember.

You can lay your hands on the e-book today; it is sent to you shortly after you check out your order. The printed book, though, will take a few weeks before it is sent. The e-book's formats are ePub, PDF and Mobi; they cover virtually all mobile phones and tablets, and Mobi is specifically for those of you with an Amazon Kindle. If you are looking for a different format, I recommend downloading the program Calibre, with which you can convert to many more formats yourself.

The best offer, I think, is to purchase both the e-book and the printed book. It costs about $25 + VAT; you then get a 90% discount on the e-book.

Order the book at Intranätverk – from $10 + VAT ›

What is Web Strategy for Everyone covering?

It is 60,000 words and 80 pictures spread over 212 pages. The number of images and the layout make it less burdensome to read than many similar books. At the same time it is written in a very ”condensed”, to-the-point manner – like most Swedish literature, it rarely repeats itself. Or, as one reader put it:

”Recently read your book and it is an achievement how much value you have managed to put between the covers. Very good!”

– Håkan Liljeqvist, founder at Kreejt

The subject of the book may seem obvious given the title, but at the same time there is no real definition of what is meant by ’web strategy’. So I chose to make the book cover the fundamentals of the many areas a web strategist must know about: the Web's history, a lot on information architecture, different approaches to web design, some of the increasingly hyped topic of web performance and, last, a do-it-yourself section with hygiene factors to check on your own website.

Not just a translation from the Swedish original

The book is not really just a translation of my Swedish book. The English book has more international image examples, and it turned out there were unexpectedly many cultural references that non-Swedes would hardly understand. So quite a lot of the book has been rewritten for an international audience.

Interestingly, several Swedish friends and acquaintances are waiting to buy the English book, despite the fact that the second edition of the Swedish book has been out for several months. Many Swedes, myself included, are probably more used to reading in English and may benefit more from an English version in a multilingual business environment.

Better English in the book compared to what I write in the blog

A first pass of the translation work was done by an Englishman who is very knowledgeable about web development and intranets. After that, a language agency also went over the book to raise the quality even further.

Needless to say, my publisher has really invested in the book, and I think it is going to read well even for those of you with English as your native language. For obvious reasons we cannot afford the same effort for every post on this blog, so do not be too quick to judge if you find linguistic peculiarities in my blog posts.

Table of Contents

Since we are not going to judge a book by its cover over the Internet, I will share the table of contents. Perhaps it gives you an idea of whether you would benefit from reading the book.

  • Before we begin
    • Why you should read this book
    • About me
  • The Web’s history and future
    • Web 1.0 – a network of documents
    • Characteristics of Web 1.0
    • Web design 1.0
    • Web 2.0 – the engaging web
    • Characteristics of Web 2.0
    • Web design 2.0
    • Web 3.0 – a network of data (also known as the semantic web)
    • Characteristics of Web 3.0
    • Web design 3.0
  • Information architecture
    • Content choreography
    • Examples of poor content choreography
    • Master Data Management prevents unnecessary duplication
    • The importance of marking up information with metadata
    • Metadata specification makes your data more standardized and interchangeable
    • Controlled vocabulary
    • Folksonomy
    • Architecture using APIs and open data
    • Public APIs, open data and the PSI Act
    • Background to the European Union’s PSI Act
    • Some take issue with the PSI Act – cumbersome access to data
    • What then is open data?
    • The benefits of an API for a startup business or when building anew
    • Design a public API with the developers’ experience in mind
    • Friendly terms and a free license
    • No surprising the developers with unforeseen breaking changes
    • Provide data in the expected format and in suitable bundles
    • Error handling and dimensioning of the service
    • Provide code samples and showcase success stories
    • Promote via data markets and API directories
    • What is the quality of data needed?
    • Microdata – semantically defined content
    • So, what is the problem?
    • The potential of semantic information
    • Microdata standards such as Schema.org and Microformats
    • Digital Asset Management (and Adaptive Content)
    • Adaptive Content
    • Image and media banks in your publishing system
    • Personalization of information
    • URL strategy for dummies
    • Common excuses for breaking established URLs
    • Ok, how to then?
  • Web design
    • GOV.UK design principles
    • Start with needs
    • Do less
    • Design with data
    • Do the hard work to make it simple
    • Iterate. Then iterate again.
    • Build for inclusion
    • Understand context
    • Build digital services, not websites
    • Be consistent, not uniform
    • Make things open: it makes things better
    • Keep it simple, stupid – KISS
    • Do not break the web
    • Persuasive web designs (PWD) – design that convinces
    • Be clear in everything
    • Be very careful of what is the default setting
    • Visual hierarchy is important
    • Focus on the common goal you and your visitor have
    • Try not to overexert your users’ attention
    • Responsive web design
    • The mobile moment
    • The elements of responsive web design
    • Arguments for responsive web design
    • Notes on responsive construction
    • Responsive typography
    • RESS – Responsive Server Side
    • Adaptive web design
    • Design with data – a data first-approach
    • Get started with design with data
    • What you know about your visitors
    • Continuous A/B testing
    • Examples of A/B tests for monitoring the website, and other communications
    • Mobile first
    • Mobile first vs. responsive web
    • The mobile opportunity
    • Mobile restrictions
    • The mobile moment – when mobile users are in the majority
    • SPA – Single Page Application
    • Design of SPA websites
    • Challenges of SPA
    • Web standards, and usability
    • Progressive enhancement and graceful degradation
    • Usability vs. accessibility
    • Gamified design
    • Design and plan for errors that will occur
    • Your website is a magazine, not a book!
  • Web performance
    • Planning for the unplanned
    • Performance optimization of databases, web servers and content management systems
    • General troubleshooting
    • Planning for high load – use cache!
    • Content Networks (CDN – Content Delivery Network)
    • Databases
    • Web servers, content management, own source code and external dependencies
    • Measuring and improving interface performance from the user’s perspective
    • Helpful tools
    • Editorial performance impact
    • Technical settings for performance
    • Recoup an investment in web performance – is it possible?
  • Test your own website
    • How to document your test
    • SEO
    • Indexable for search engines
    • Duplicate content
    • Page title’s length is under 60 characters
    • Page title is readable and understandable in the search engine results page
    • Page title contains relevant keywords that describe the page
    • Correct headings are used
    • Search engine friendly URLs
    • Descriptive text on all important pages
    • Reasonable number of links
    • Pictures have alternative texts
    • Structured description of the information
    • Web analytics
    • Current visitor tracking scripts
    • Tracks the use of website search
    • Performance
    • Reasonable time for loading the page
    • Compression of text files
    • Usage of the browser cache
    • Scripts and style sheets are sent in a compact format
    • Images are optimized for fast transfer
    • Reasonable number of background images, scripts and stylesheets
    • Requesting files and pages that do not exist
    • Minimal amount of scripts and CSS in page code
    • Images are not scaled down using CSS or HTML
    • Identical files are not referenced
    • Reasonable amount of scripts in the page head
    • Content networks are used when necessary
    • Accessibility and Usability
    • Website validates the chosen code standard
    • Using correct header structure
    • Anchor-texts are descriptive
    • Link titles not used for non-essential information
    • Favorite icon is present
    • Possible to navigate with keyboard
    • Texts are written to be read by a human – not with exaggerated SEO
    • Language set in the source code
    • Not depending on browser features
    • Specifies image sizes in HTML
    • Works with and without the www prefix
    • Only one domain is used for the website
    • RSS subscriptions can be detected
    • Useful error pages
    • No surprises when scrolling
    • Enough distance between links, buttons, etc.
    • Acceptable text size
    • Zoomable, also on mobile
    • Icons for the website
    • Usable printouts
    • Others
    • Forms and other sensitive information are sent through a secure channel
  • Tips on in-depth reading
  • Sources & references
  • Thanks goes out to…

Check out Web Strategy for Everyone from the publisher Intranätverk. It costs from $10 + VAT ›
