If you manage a large web presence, you will know that one of the most difficult jobs is keeping track of everything you have online, and seeing whether your sites are improving against your key targets – or actually going the other way.
Most organisations are thinking about – and are somewhere on the road to achieving – web governance.
Put simply, web governance involves establishing a framework that contains web policies and a set of standards that need to be enforced across a web presence. All organisations are different, so details will change, but the important thing is to have a policy and stick to it. Large organisations can ensure this happens by monitoring their web presence, perhaps using outside assistance, and targeting priority fixes.
Governisation from Sitemorse allows its users to track their key priorities simply and effectively, without wasting any time. Pre-agreed targets and priorities are constantly monitored, and the user can access the results through a personalised dashboard.
You can get some idea of how this works by looking at this 40-second video on the Governisation website.
After logging in, the user is alerted to the number of pages in their web presence achieving pre-agreed standards. In the fictional example used, 87 per cent of pages met the pre-agreed standards. An easy-to-understand management summary then points out the areas that need fixing first.
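A summary like the fictional one above could be produced from simple audit data. The sketch below is purely illustrative: the page paths, issue names and data layout are invented for the example and are not Sitemorse's actual format.

```python
from collections import Counter

# Hypothetical audit results: one record per page, listing the
# standards checks that page failed (an empty list means the page passes).
pages = {
    "/":         [],
    "/about":    [],
    "/news":     ["missing-title"],
    "/contact":  ["broken-link", "missing-alt-text"],
    "/products": [],
}

# Headline figure: what proportion of pages meet the agreed standards?
passing = sum(1 for issues in pages.values() if not issues)
score = round(100 * passing / len(pages))
print(f"{score}% of pages meet the agreed standards")  # → 60% of pages ...

# Management summary: which issues affect the most pages, so they
# can be targeted as priority fixes?
by_issue = Counter(issue for issues in pages.values() for issue in issues)
for issue, count in by_issue.most_common():
    print(f"fix first: {issue} ({count} page(s))")
```

With five pages and three passing, the headline score here works out at 60 per cent; the same calculation over a real site would yield figures like the 87 per cent in the example above.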
If you would like to find out more about Governisation, and how it could help you keep track of your web presence, contact us and we will be happy to arrange a full demo. The Governisation website explains more about the system and how it works.
More than fifteen years after the internet began to be a mass-market experience there are no longer any excuses for links that don’t work, or pages that do not have titles.
Yet in a recent survey of the top 500 FTSE companies, Sitemorse still found well over two per cent of web pages lacking a title, and well over three per cent failing basic functional tests.
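Checks like these, such as pages with no title, are straightforward to automate. The following is a minimal sketch using Python's standard-library `HTMLParser`; a real audit tool would also fetch each page over the network and verify that every collected link actually resolves. The class and function names are illustrative, not Sitemorse's API.

```python
from html.parser import HTMLParser

class TitleAuditor(HTMLParser):
    """Collects the <title> text and the hyperlinks from one page."""
    def __init__(self):
        super().__init__()
        self._in_title = False
        self.title = ""
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit(html):
    """Return (has_title, links) for one page's HTML."""
    parser = TitleAuditor()
    parser.feed(html)
    return bool(parser.title.strip()), parser.links

# Example: one page with a title, one without.
ok, links = audit("<html><head><title>Home</title></head>"
                  "<body><a href='/about'>About</a></body></html>")
bad, _ = audit("<html><head></head><body>No title here</body></html>")
print(ok, bad, links)  # → True False ['/about']
```

Running such a check across an entire site is what turns a vague sense of quality into the kind of percentage figures quoted above.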
Since a company’s website is the first port of call for virtually all users nowadays, missing images and poor links can give a poor initial impression. After all, if an organisation’s website is put together in a slapdash fashion, what does that say about the business itself?
Google and other search engines may not properly catalogue or index a site that contains HTML errors, and that can mean fewer users finding what they are looking for – and in the case of e-commerce sites, perhaps lost sales and the consequent hit to the company’s bottom line.
Around a quarter of users, according to recent research, will duck out of an online sale because of technical issues. A massive 82% of consumers said that if a business’s website performed badly, it would dissuade them from buying goods from that organisation on the web – or even in-store.
Yet recent Sitemorse benchmarks show many online retailers either do not know this or choose to ignore it, with some of the best-known high street names performing very badly on quality issues.
Changes at Google mean that stale website content will be penalised – and that means organisations wanting a high ranking on Google may need to change their ways.
The latest update is designed to analyse whether a person wants up-to-date results or historical data. And Google estimates the alterations to its core ‘algorithm’ will make a difference to about 35% of searches.
The update to improve the “freshness” of results builds on a large-scale update made to the underlying infrastructure of Google’s core indexing system in August 2010 known as Caffeine, making it easier for Google to keep its index up to date and to add new sources of information.
Web analysts have described the changes as “huge”. The last big update to the Google algorithm, known as Panda, affected only 12% of searches.
Google Panda was built through an algorithm update that used artificial intelligence in a more sophisticated and scalable way than previously possible. Human quality testers rated thousands of websites based on measures of quality, including design, trustworthiness, speed and whether or not they would return to the website.
Google’s new Panda machine-learning algorithm, made possible by, and named after, engineer Navneet Panda, was then used to look for similarities between the websites people found to be high quality and those found to be low quality.
Many new ranking factors have been introduced to the Google algorithm as a result, while older ranking factors like PageRank have been downgraded in importance.
One thing is clear: websites featuring good, regularly updated copy will be rated more highly by Google. And those that do not will be less visible on the search engine.
If you think every move you make online can’t be watched, you could be living in Cloud Cuckoo-Land, as this scary video from Privacy International shows: what did you do at lunchtime?