Voting in the UK election has finished and the results are in, but the dust has most definitely not settled. It looks as if we in the UK are in for interesting times ahead. It would help those of us researching the various political parties and policies if Google could at least get the basics right, such as who is now the Member of Parliament for a particular constituency. I am in Reading East and we have switched from a Conservative MP to Labour (Matt Rodda). Out of curiosity, I tried a search in Google on Reading East constituency. This is what Google’s Knowledge Graph came up with:
I took this screenshot yesterday (Friday, 9th June) at around 8 a.m. and expected to see Rob Wilson given as the MP throughout. I was impressed, though, to see that the snippet from Wikipedia correctly gives Matt Rodda as our MP. Whoever updated that entry was pretty quick off the mark. Possibly a Labour Party worker? The rest of the information, which is taken from Google’s database of “facts”, is either wrong, confusing or nonsensical.
“Member of Parliament: Rob Wilson” – wrong. But he was MP until around 4 a.m. on the 9th June when the result of the election in Reading East was announced, so perhaps I am expecting a little too much from Google to be that quick about updating its facts.
“Major settlement: Reading” – yes we are part of Reading but I find it strange that it is referred to as a major settlement rather than a town.
“Number of members: 1” – not sure why that is there as each constituency can only have one MP.
“Party: Conservative” – correct for Rob Wilson but the new MP is Labour.
“European Parliament constituency: South East England” – correct!
The final two lines “Replaced by:” and “Created from:” had me totally flummoxed. The entries are the same – Reading North, Reading South, Henley. Reading North and Reading South were constituencies formed by splitting the Reading constituency in 1950. They were then merged back into Reading in 1955, re-created in 1974, and in 1983 Reading East and West were formed (Yes, it’s complicated!). As for Henley, it is not even in the same county. I can only think that this comes from Caversham (now part of Reading East) being part of Oxfordshire until 1911, when it probably did fall within the Henley constituency. The “Replaced by” is wrong because Reading East has not been replaced by anything. Google can’t even blame a template that has to be filled in with information at all costs because different information appears in the Knowledge Graph depending on the constituency.
Here is the information for Aylesbury:
And the one for Guildford:
Going back to how up to date the information is, how quickly does Google update its “facts”? Rob Wilson was still listed as our MP mid-Friday afternoon. I submitted feedback using the link that Google provides at the bottom of each Knowledge Graph but this morning (10th June) nothing had changed. I’ll update this posting when it does change.
I would hope that most people would look at the other links in the search results, in this case the latest news, but preferably a reliable authoritative source. The list of MPs on the UK Parliament website would be an obvious choice but might take a day to be updated after an election. Just don’t rely on Google to get it right.
Ok, we know that Google often does strange things with our searches but much of the time it is not obvious that something odd has happened. There are usually some “good enough” answers scattered through the first 20-30 results so that we shrug off the rest as “well, that’s Google for you”. Occasionally, though, one comes across a search that seems to break Google. One such example was reported on Twitter this morning by Rand Fishkin (@randfish). The search was
this is the best * on the internet
At the top of the first results page Google reported that it had found over a billion results but when @randfish moved to the next page Google showed just “2 of 12 results”! Whatever happened to the other billion or so?
I tried the search myself on my laptop and straightaway got three results but on repeating it that was reduced to two.
I repeated the search having logged out of my Google account, cleared cookies, used Incognito and different browsers. Same results.
I tried a phrase search and the number of hits increased to 17.
Then I removed the quotation marks, got back to my original set of two and ran Verbatim on it. Over a billion hits but, bizarrely, Google claimed to have gone straight to page 2!
Note: you normally can’t see the number of results after you have run Verbatim because it is obscured by a second menu line. You can toggle between that menu and the number of hits by clicking on the Tools button.
Then I tried a phrase search followed by Verbatim: two results but different from my first set.
I could have gone on trying various advanced search commands but it is very clear that Google is having problems with this particular search. And, no, I have no idea what is going on here.
Search Engine Roundtable reports today that Google is advising against using the link operator in search. It seems that there have been complaints on Twitter and elsewhere that it is returning some odd results.
I have never been a fan of the command; it only ever returned a small sample of pages that link to a known page, so I don’t mention it in my workshops unless asked about it by one of the participants. When I saw the advice from Google I gave it a final go on my own domain rba.co.uk and got nearly 300,000 hits. “Wow,” I thought, “amazing!” Glancing through the first few results it became obvious that Google had ignored all the punctuation and was running a text search and looking for variations on rba including RBS (Royal Bank of Scotland).
No great loss, but a sign that other more useful operators and commands may be for the chop.
This is my usual Christmas/New Year reminder to never trust Google’s answers (or Bing’s) on opening times of shops over the holiday season, especially if you are thinking of visiting small, local, independent shops.
I was contemplating going to our True Food Co-operative but suspected that it might still be shut. A search on my laptop for True Food Emmer Green opening times gave me a link to their website at the top of the results list. On the right hand side was a knowledge graph with information on the shop, its opening times and reviews that had been compiled from a variety of sources. For most of it the source of the information is not given. On my mobile and tablet it is the knowledge graph that appears at the top of the results list and takes up the first couple of screens.
It claims that the shop is “Open today 10am-6pm” [today is Thursday, 29th December].
When I go to True Food’s website it clearly states near the top of the home page that they are currently closed and re-opening on 4th January 2017.
Google gets it wrong again in the knowledge graph but so does Bing. So, always check the shop’s own website, and if you are searching on your mobile or tablet please make the effort to scroll down a couple of screens to get to links to more reliable information.
The contract for our domestic electricity supply is ending next month so I am trawling through cost comparison and energy supplier websites to check tariffs for our next contract. (UK readers can skip the rest of this explanatory paragraph). I don’t know what the situation is in other countries but in the UK the gas and electricity suppliers are forever inventing a variety of tariffs priced significantly less than their “standard” rates to entice you to sign up. The lower priced tariffs are generally only available for a year, or two years at most. At the end of the contract the customer is usually transferred to the more expensive standard rate unless they actively seek out an alternative. The existing supplier is obliged to inform the customer of the new tariffs that will be on offer but the onus is on the customer to inform the company which tariff, if any, they wish to switch to. For other suppliers’ tariffs the customer has to do their own research.
Price comparison sites are a good starting point to identify potential alternatives but the only way to check that a tariff meets all of your criteria, of which price may be just one of many, is to go direct to the supplier’s website. Today I spent most of the morning drawing up the shortlist.
The next step in my strategy was to look at customer reviews on the comparison websites, social media, discussion boards and to run a Google search on each supplier. The reviews and comments generally spanned several years and while the history of a company’s customer service performance can be useful it is the last 12-18 months that are most relevant. This is where limiting the search to more recent information by using Google’s date option comes into play. Having spent an hour or so to get this far, and with my brain beginning to wilt, it was tempting to read just the Google snippets for the reviews; but they can convey the wrong overall impression. Google sometimes creates snippets by pulling together text from two or more sections of a page that may be separated by several paragraphs and which may be about completely different products or topics. Never take the snippet at face value and always click through to the original, full article.
One of the energy providers on my short list is Robin Hood Energy, which is a not-for-profit company run by Nottingham City Council and has only recently been made available to customers outside of Nottingham. Customer reviews are therefore less plentiful than for many of the other utilities. The results from a search on
Robin Hood Energy customer reviews
included one from Simply Switch. Underneath the title and URL is a star rating of 4.4 from 221 reviews and one could be forgiven for assuming that this refers to Robin Hood Energy. This is reinforced by the text in the second half of the snippet: “Robin Hood guarantee their customers consistently low prices … rated 4.4/5 based on 221 reviews”.
The dots are important in that they represent a missing chunk of text between the two pieces of information. When I looked at the web page itself the rating was nowhere to be found in the main body of the text. It was in the footer of the page and referred to the Simply Switch site.
A reminder, then, to never rely on the snippets for an answer, and always click through and read the whole web page.
Google Blogger has done it again. A major update to the service was rolled out at the end of September and many users woke up to find that the links and blog lists they had so carefully created had gone. See the Blogger Help Forum for some of the postings and comments on the incident. Blogger engineers are supposedly working to restore the lost information but it “may take up to several days.” Or never! This is not the first time that blog content has gone missing after an update. A few years ago an update somehow removed the most recent posts from people’s blogs. Most of them were eventually recovered but a few disappeared without trace.
The lesson learned from that experience was back up your blog. In Blogger the import and backup tool is under Settings, Other and at the top of the page. Note, though, that this will only back up the text of pages, posts and comments. It does not back up any changes you have made to the template, or the content of the gadgets in your sidebars such as links lists and blogrolls. For the template click on Template in the lefthand sidebar and then on Backup/Restore. This will save the general layout of the gadgets but not the content. For that you will need to copy and save the content for each gadget or save a copy of the content and HTML of your blog. Back up your Blogger blog: photos, posts, template, and gadgets has details of what you need to do.
If you don’t have a copy of your lists of links then see if you can access an older cached version of your blog via Google or Bing and save the whole page, or take screenshots. If you try this several days after the event you may be out of luck. Mine were still in the cached page for up to 2 days but have now gone. In Google, use the ‘cache:’ command followed by your blog’s address.
An alternative is to search for your blog and next to your entry in the results lists there should be a small downward pointing green arrow. Click on it and then on the ‘Cached’ text to view the page. This works in both Google and Bing and, again, the sooner you do this the better.
If none of that works then try the Wayback Machine. Type in the URL of your blog and see if they have any snapshots.
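If you do go down the Wayback Machine route, archive.org also offers an availability API that reports the closest snapshot it holds for a URL. The sketch below builds the query URL and parses the response; the endpoint and JSON field names are as archive.org documents them at the time of writing, and the example URL is just an illustration.

```python
import json
import urllib.parse
import urllib.request

def availability_url(page_url, timestamp=None):
    """Build a query URL for the Wayback Machine availability API.
    timestamp (optional, YYYYMMDD...) asks for the snapshot closest
    to that date rather than the most recent one."""
    params = {"url": page_url}
    if timestamp:
        params["timestamp"] = timestamp
    return "https://archive.org/wayback/available?" + urllib.parse.urlencode(params)

def closest_snapshot(page_url, timestamp=None):
    """Fetch the availability record and return the URL of the closest
    snapshot, or None if the Wayback Machine has no copy at all."""
    with urllib.request.urlopen(availability_url(page_url, timestamp)) as resp:
        data = json.load(resp)
    snap = data.get("archived_snapshots", {}).get("closest")
    return snap["url"] if snap else None

# Example: ask for the snapshot of a blog nearest to 1st September 2016
print(availability_url("http://www.rba.co.uk/", "20160901"))
```

This only tells you whether a snapshot exists and where it lives; you would still save the page itself (and your links lists with it) by hand.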
Still no joy? Then either hang around a while longer to see if the Blogger engineers manage to revive your lists or start rebuilding them from scratch. If you haven’t looked at them in a while, maybe now is the time to review the content anyway.
A couple of weeks ago I wrote about the problems I was having with Google Verbatim (Google Verbatim on the way out?). This morning I ran through a checklist of commands that I am demonstrating in a webinar and it seems that Verbatim is back working as it should. Don’t hold your breath, though. Three times this year I have seen Google Verbatim disappear or do strange things and a couple of weeks later return to normal. Verbatim may be here to stay or it may not, but you cannot depend on many advanced search commands to always work as you expect. So either learn different ways of making Google treat your search in the way you require or use a different search engine.
Unfortunately, disappearing or unreliable functionality is not confined to just Google. Bing used to have a very useful proximity command that allowed you to specify how close you wanted your words to be to one another. The “near:n” operator is still listed in Bing’s list of advanced search commands and, although it seems to do something and reduce the number of results, it does not behave as described.
There is also the endangered list such as DuckDuckGo’s sort by date option. In fact all of DuckDuckGo’s web search options will probably soon change or disappear as it is currently powered by Yahoo! which has been bought by Verizon. Who will DuckDuckGo turn to if Verizon does combine Yahoo with AOL as has been stated in the press?
Get to know several different search tools really well and, for the ones that you use regularly, find out how they work and who provides the search results.
Update: 1st September 2016 – Verbatim seems now to be working as it should. I hope it stays that way but on three occasions this year I have seen it work one day, then not the next and then back to working again.
We have become accustomed to Google rewriting and messing about with our searches, and dropping search features that are infrequently used. The one option that could control most of Google’s creative interpretation of our queries was Verbatim but that now looks as though it could be destined for the axe as well.
A reminder of what Verbatim does. If you want to stop Google looking for variations on your terms, ignoring double quote marks around phrases, or dropping words from the search, Verbatim is, or rather was, the quickest way to do it. If you are using a desktop or a laptop computer, run your search as normal. On the results page click on ‘Search Tools’ at the end of the line of options that appears at the top. Then, from the second line of options that should appear, choose ‘All results’ followed by Verbatim. The location of Verbatim on other devices varies.
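On the desktop, clicking through those menus simply adds a parameter to the results URL, which at the time of writing is tbs=li:1. That means a Verbatim search can also be built directly, as the sketch below shows; the parameter is undocumented by Google and could change or vanish without notice.

```python
import urllib.parse

def verbatim_search_url(query):
    """Build a Google results URL with Verbatim switched on.
    tbs=li:1 is the (undocumented) parameter the Verbatim menu option
    adds to the URL; treat it as liable to change."""
    params = {"q": query, "tbs": "li:1"}
    return "https://www.google.com/search?" + urllib.parse.urlencode(params)

print(verbatim_search_url('Penryn "memorial garden" "heritage trail"'))
```

Bookmarking or scripting such a URL saves the two or three clicks the menus require, which is handy if you use Verbatim on most of your searches.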
Verbatim has been invaluable when searching on titles of research papers, legislation or researching topics for which you expect or want to retrieve very few or zero results. You might be researching rare adverse events associated with a pharmaceutical drug or wanting to confirm that what you are about to patent has not already been published and is out there for all to see. Or the topic is so specific that you only expect to see a handful of documents, if that. So, sometimes, no or a low number of results is a good thing. But Google does not like zero or small numbers of results and that is when Google’s search rewrite goes into overdrive.
I had noticed for a few months that Verbatim was not always working as expected but had hoped it was one of Google’s experiments. The problem has not gone away and the really confusing part is that Verbatim is still doing something but not what I would expect.
I was working in Penryn in July and took the opportunity to wander around the place. Inevitably, I googled some of the sites I had seen for further information but one threw up the Verbatim problem. I was particularly interested in what looked like a memorial but didn’t have time to seek out information on site. Looking at the photo afterwards I can see where the plaque was (to the right and next to the flagpole) but I missed it on the day.
I did see a sign on the wall surrounding the area, though, telling me that it was “two” on the Penryn Heritage Trail.
A quick, basic search told me that it is called the Memorial Garden but I wanted to find out more. I searched on Penryn memorial garden heritage trail.
This gave me 15,900 results but Google had decided to leave out Penryn so I was seeing plenty of information about heritage trails but they were not all in Penryn. I prefixed Penryn with intext: to force Google to include it in the search but then the word heritage was dropped. I applied Verbatim to the search without the intext: command.
This gave me 732 results but even though I had applied Verbatim Google had dropped ‘memorial’ from the search. I prefixed memorial with intext: and got 1230 results with little change to the top entries. And no, I have no idea why there are more hits for this more specific search. I can only assume that other terms were omitted but I was not seeing that in my top 50. I then did what I should have done right from the start and searched on Penryn with “memorial garden” and “heritage trail” as phrases. When Verbatim was applied this came back with 22 results but no detailed information about the garden. I started to tweak the search terms a little more. Verbatim would drop a term, I would prefix it with intext: and it was then included, but I began to suspect that I was being too specific. So I dropped “heritage trail” from the search and cleared Verbatim: 19,300 results with all of the top entries being relevant and informative.
This emphasises that it often pays to keep your search simple, and I mean really simple. Including too many terms, however relevant you may think they are, can be counter-productive. I would have realised earlier that my strategy was too complex had Verbatim behaved as I assumed it would and it had included all of my terms with no variations or omissions.
I ran a few of my test searches to see if this is now a regular feature. One was:
prevalence occupational asthma diagnosis agriculture UK
The results came back as follows:
Ordinary search – prevalence missing from some of the documents, 1,750,000 results
Verbatim search – diagnosis and agriculture missing from some of the documents, 15,300 results
Verbatim with quote marks around missing terms – same results as plain Verbatim with diagnosis and agriculture still missing
Verbatim search but prefixing missing terms with intext:, 14,200 results
I changed the search slightly to:
incidence occupational asthma diagnosis agriculture UK
Some of the results were:
Ordinary search – incidence and agriculture missing from some of the documents, 2,210,000 results
Verbatim search – incidence and agriculture missing, 15,500 results
Ordinary search on intext: incidence occupational asthma diagnosis intext:agriculture UK, 848,000 results
Verbatim intext:incidence occupational asthma diagnosis intext:agriculture UK, 15,000 results
I saw the same pattern with a few other searches. I also tested the searches in incognito mode, and both signed in and signed out of my Google account. There was very little difference in the results and Verbatim behaved in the same way.
It looks as though Verbatim still runs your search without any variations on your terms or synonyms but that it now sometimes chooses to omit terms from some of the documents. To keep those terms in the search you have to prefix them with intext:. Double quote marks around the words are sometimes ignored. This is an unnecessary change and defeats the object of having an option such as Verbatim.
More worrying, though, is that Google obviously thinks Verbatim needs “fixing”. But what it has done is to make the option more difficult to use, which in turn will result in people using it less often than they do already. And if Google sees that use is decreasing it will simply get rid of it altogether. Time to swot up on the few remaining Google commands, or use a different search tool.
It looks as though Google’s daterange: command really has gone for good. Over the last 6 months it has been a case of “now it works, now it doesn’t” but I’ve been testing it regularly over the past couple of months and it seems to have permanently stopped working. People have been reporting the problem in various forums since the start of this year.
So why bother using “daterange:” instead of the date/time option under Search tools? Because the latter does not work with Verbatim. It doesn’t happen often but there are occasions when I need Google to search using my terms exactly as I have typed them without any omissions or variations AND limit the search to a specified time period. The only way to do that was to first run the search with the daterange included in the string and then apply Verbatim to the results.
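Part of what made daterange: awkward was that it took Julian day numbers rather than calendar dates. The conversion is mechanical, though, as this sketch shows (assuming the standard Julian day numbering; the operator format is daterange:start-end):

```python
from datetime import date

def julian_day(d):
    """Integer Julian day number for a calendar date.
    date.toordinal() counts days from 0001-01-01 in the proleptic
    Gregorian calendar; adding 1721425 shifts it onto the Julian day
    scale (e.g. 1st January 2000 is JDN 2451545)."""
    return d.toordinal() + 1721425

def daterange_operator(start, end):
    """Build the Google daterange: search term for start..end inclusive."""
    return f"daterange:{julian_day(start)}-{julian_day(end)}"

# e.g. restrict a search to calendar year 2016
print(daterange_operator(date(2016, 1, 1), date(2016, 12, 31)))
```

You would then append the generated term to your query string before applying Verbatim to the results, which was the workaround described above while the command still worked.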
It is getting to the point where Google is totally useless for advanced, focussed research. What will be next for the chop? filetype? site? If you haven’t done so already, it is time to learn how to use the alternative search tools. Cue blatant plug for my September workshop with UKeiG: Essential non-Google search tools!
If you have attended one of my recent search workshops, or glanced through the slides, you will have noticed that I have a new test query: the height of Ben Nevis. It didn’t start out as a test search but as a genuine query from me. A straightforward search, I thought, even for Google.
I typed in the query ‘height of ben nevis’ and across the top of the screen Google emblazoned the answer: 1345 metres. That sort of rang a bell and sounded about right, but as with many of Google’s Quick Answers there was no source and I do like to double or even triple check anything that Google comes up with.
To the right of the screen was a Google Knowledge Graph with an extract from Wikipedia telling me that Ben Nevis stands at not 1345 but 1346 metres above sea level. Additional information below that says the mountain has an elevation of 1345 metres and a prominence of 1344 metres (no sources given). I now have three different heights – and what is ‘prominence’?
After a little more research I discovered that prominence is not the same as elevation, but I shall leave you to investigate that for yourselves if you are interested. The main issue for me was that Google was giving me at least three slightly different answers for the height of Ben Nevis, so it was time to read some of the results in full.
Before I got around to clicking on the first of the two articles at the top of the results, alarm bells started ringing. One of the metres to feet conversions in the snippets did not look right.
So I ran my own conversions for both sets of metres to feet and in the other direction (feet to metres):
1344m = 4409.449ft, rounded down to 4409ft
4406ft = 1342.949m, rounded up to 1343m
1346m = 4416.01ft, rounded down to 4416ft
4414ft = 1345.387m, rounded down to 1345m
As if finding three different heights was not bad enough, it seems that the contributors to the top two articles are incapable of carrying out simple ft/m conversions, but I suspect that a rounding up and rounding down of the figures before the calculations were carried out is the cause of the discrepancies.
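The suspicion about premature rounding is easy to confirm with a few lines of code, using the exact definition of 0.3048 metres to the foot. Converting the full-precision GPS height gives one answer; rounding the metres first gives another:

```python
FT = 0.3048  # metres per foot (exact, by international definition)

def m_to_ft(m):
    return m / FT

def ft_to_m(ft):
    return ft * FT

# The four conversions from the list above all check out
assert round(m_to_ft(1344)) == 4409
assert round(ft_to_m(4406)) == 1343
assert round(m_to_ft(1346)) == 4416
assert round(ft_to_m(4414)) == 1345

# Rounding too early changes the answer: the surveyed height of
# 1344.527 m converts to 4411 ft, but rounding to 1345 m before
# converting gives 4413 ft.
print(round(m_to_ft(1344.527)))  # 4411
print(round(m_to_ft(1345)))      # 4413
```

Carrying the full precision through to the final step, and only then rounding, is exactly the rule my science teacher drummed into me.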
The above results came from a search on Google.co.uk. Google.com gave me similar results but with a Quick Answer in feet, not metres.
We still do not have a reliable answer regarding the height of Ben Nevis.
Three articles below the top two results were from BBC News, The Guardian and Ordnance Survey – the most relevant and authoritative for this query – and were about the height of Ben Nevis having been remeasured earlier this year using GPS. The height on the existing Ordnance Survey maps had been given as 1344m but the more accurate GPS measurements came out at 1344.527m or 4411ft 2in. The original Ordnance Survey article explains that this is only a few centimetres different from the earlier 1949 assessment but it means that the final number has had to be rounded up rather than down. The official height on OS maps has therefore been increased from 1344m to 1345m. So Google’s Quick Answer at the top of the results page was indeed correct.
Why make a fuss about what are, after all, relatively small variations in the figures? Because there is one official height for the mountain and one of the three figures that Google was giving me (1346m) was neither the current nor the previous height. Looking at the commentary behind the Wikipedia article, which gave 1346m, it seems that the contributors were trying to reconcile the height in metres with the height in feet but carrying out the conversion using rounded up or rounded down figures. As one of my science teachers taught me long ago, you should always carry forward to the next stage of your calculations as many figures after the decimal point as possible. Only when you get to the end do you round up or down, if it is appropriate to do so. And imagine if your Pub Quiz team lost the local championship because you had correctly answered 1345m to this question but the MC had 1346m down as the correct figure? There’d be a riot if not all-out war!
That’s what Google gave us. How did Bing fare?
The US and UK versions of Bing gave results that looked very similar to Google’s but with two different quick answers in feet, and neither gave sources:
I won’t bore you with all of the other search tools that I tried except for Wolfram Alpha. This gave me 1343 meters or 4406 ft. At least the conversion is correct but there is no direct information on where the data has been taken from.
The sources link was of no help whatsoever and referred me to the home pages of the sites and not the Ben Nevis specific data. On some of the sites, when I did find the Ben Nevis pages, the figures were different from those shown by Wolfram Alpha so I have no idea how Wolfram arrived at 1343 meters.
So, the answer to my question “How high is Ben Nevis?” is 1344.527m rounded up on OS maps to 1345m.
And the main lessons from this exercise are:
Never trust the quick answers or knowledge graphs from any of the search engines, especially if no source is given. But you knew that anyway, didn’t you?
If you are seeing even small variations in the figures, and there are calculations or conversions involved, double check them yourself.
Don’t skim read the results and use information highlighted in the snippets – read the full articles and from more than one source.
Make sure that the articles you use are not just copying what others have said.