I have worked in the information profession for over twenty years and have been a freelance consultant since 1989. My company (RBA Information Services) provides training and consultancy on the use of the Internet, and on accessing and managing information resources. Prior to setting up RBA I worked at the Colindale Central Public Health Laboratory, and then spent ten years in the pharmaceutical and healthcare industry before moving to the international management consultancy group Strategic Planning Associates.
I edit and publish an electronic newsletter called Tales from the Terminal Room. Other publications include Search Strategies for the Internet.
I am a Fellow of CILIP: The Chartered Institute of Library and Information Professionals, and a member of the UK eInformation Group (UKeiG).
The contract for our domestic electricity supply is ending next month so I am trawling through cost comparison and energy supplier websites to check tariffs for our next contract. (UK readers can skip the rest of this explanatory paragraph). I don’t know what the situation is in other countries but in the UK the gas and electricity suppliers are forever inventing a variety of tariffs priced significantly less than their “standard” rates to entice you to sign up. The lower priced tariffs are generally only available for a year, or two years at most. At the end of the contract the customer is usually transferred to the more expensive standard rate unless they actively seek out an alternative. The existing supplier is obliged to inform the customer of the new tariffs that will be on offer but the onus is on the customer to inform the company which tariff, if any, they wish to switch to. For other suppliers’ tariffs the customer has to do their own research.
Price comparison sites are a good starting point to identify potential alternatives but the only way to check that a tariff meets all of your criteria, of which price may be just one of many, is to go direct to the supplier’s website. Today I spent most of the morning drawing up the shortlist.
The next step in my strategy was to look at customer reviews on the comparison websites, social media and discussion boards, and to run a Google search on each supplier. The reviews and comments generally spanned several years and while the history of a company’s customer service performance can be useful it is the last 12-18 months that are most relevant. This is where limiting the search to more recent information by using Google’s date option comes into play. Having spent an hour or so getting this far, and with my brain beginning to wilt, it was tempting to read just the Google snippets for the reviews; but they can convey the wrong overall impression. Google sometimes creates snippets by pulling together text from two or more sections of a page that may be separated by several paragraphs and which may be about completely different products or topics. Never take the snippet at face value and always click through to the original, full article.
One of the energy providers on my short list is Robin Hood Energy, which is a not-for-profit company run by Nottingham City Council and has only recently been made available to customers outside of Nottingham. Customer reviews are therefore less plentiful than for many of the other utilities. The results from a search on
Robin Hood Energy customer reviews
included one from Simply Switch. Underneath the title and URL is a star rating of 4.4 from 221 reviews and one could be forgiven for assuming that this refers to Robin Hood Energy. This is reinforced by the text in the second half of the snippet: “Robin Hood guarantee their customers consistently low prices … rated 4.4/5 based on 221 reviews”.
The dots are important in that they represent a missing chunk of text between the two pieces of information. When I looked at the web page itself the rating was nowhere to be found in the main body of the text. It was in the footer of the page and referred to the Simply Switch site.
A reminder, then, to never rely on the snippets for an answer, and always click through and read the whole web page.
Google Blogger has done it again. A major update to the service was rolled out at the end of September and many users woke up to find that the links and blog lists they had so carefully created had gone. See the Blogger Help Forum for some of the postings and comments on the incident. Blogger engineers are supposedly working to restore the lost information but it “may take up to several days.” Or never! This is not the first time that blog content has gone missing after an update. A few years ago an update somehow removed the most recent posts from people’s blogs. Most of them were eventually recovered but a few disappeared without trace.
The lesson learned from that experience was back up your blog. In Blogger the import and backup tool is under Settings, Other and at the top of the page. Note, though, that this will only back up the text of pages, posts and comments. It does not back up any changes you have made to the template, or the content of the gadgets in your sidebars such as links lists and blogrolls. For the template click on Template in the lefthand sidebar and then on Backup/Restore. This will save the general layout of the gadgets but not the content. For that you will need to copy and save the content for each gadget or save a copy of the content and HTML of your blog. Back up your Blogger blog: photos, posts, template, and gadgets has details of what you need to do.
If you don’t have a copy of your lists of links then see if you can access an older cached version of your blog via Google or Bing and save the whole page, or take screenshots. If you try this several days after the event you may be out of luck. Mine were still in the cached page for up to 2 days but have now gone. In Google, use the ‘cache:’ command followed by your blog’s address, for example cache:myblog.blogspot.com
An alternative is to search for your blog and next to your entry in the results lists there should be a small downward pointing green arrow. Click on it and then on the ‘Cached’ text to view the page. This works in both Google and Bing and, again, the sooner you do this the better.
If none of that works then try the Wayback Machine. Type in the URL of your blog and see if they have any snapshots.
Still no joy? Then either hang around a while longer to see if the Blogger engineers manage to revive your lists or start rebuilding them from scratch. If you haven’t looked at them in a while, maybe now is the time to review the content anyway.
This is the list of Top Tips that delegates attending the UKeiG workshop on 7th September 2016 in London came up with at the end of the training day. Some of the usual suspects such as the ‘site:’ command, Carrot Search and Offstats are present but it is good to see Yandex included in the list for the first time.
Carrot Search http://search.carrotsearch.com/carrot2-webapp/search or http://carrotsearch.com/ and click on the “Live Demo” link on the left hand side of the page.
This was recommended for its clustering of results and also the visualisations of terms and concepts via the circles and “foam tree”. The Web Search uses eTools.ch for the general searches and there is also a PubMed option.
Yandex http://www.yandex.com/ The international version of the Russian search engine with a collection of advanced commands – including a proximity operator – that makes it a worthy competitor to Google. Run your search and on the results page click on the icon of two horizontal lines next to the search box to see the advanced options.
There are country versions of Yandex for Russia, Ukraine, Belarus, Kazakhstan and Turkey. You will, though, need to know the languages to get the best out of them and apart from Turkey they use a different alphabet.
If you are fed up with seeing the same results from Google again and again give MillionShort a try. MillionShort enables you to remove the most popular web sites from the results. The page that best answers your question might not be well optimised for search engines or might cover a topic that is so specialised that it never makes it into the top results in Google or Bing. Originally, as its name suggests, it removed the top 1 million but you can change the number that you want omitted. There are filters to the left of the results enabling you to remove or restrict your results to ecommerce sites, sites with or without advertising, live chat sites and location. The sites that have been excluded are listed to the right of the results.
site: command Use the site: command to focus your search on particular types of site, for example include site:ac.uk in your search for UK academic websites. Or use it to search inside large rambling sites with useless navigation, for example site:www.gov.uk. You can also use -site: to exclude individual sites or a type of site from your search. All of the major web search engines support the command.
Microsoft Academic Search http://academic.research.microsoft.com/ An alternative to Google Scholar. “Semantic search provides you with highly relevant search results from continually refreshed and extensive academic content from over 80 million publications.” This was recently revamped and although it now loads and searches faster than it used to the new version has lost the citation and co-author maps that were so useful. It can be a useful way of identifying researchers, publications and citations but do not rely on the information too much. It can get things very wrong indeed. For example, I’ve found that for some reason the affiliation of several authors from the Slovak Technical University in Bratislava is given as the Technical University of Kenya!
Wolfram Alpha https://www.wolframalpha.com/ This is very different from the typical search engine in that it uses its own curated data. Whether or not you get an answer from it depends on the type of question and how you ask the question. The information is pulled from its own databases and for many results it is almost impossible to identify the original source, although it does provide a possible list of resources. If you want to see what Wolfram Alpha can do try out the examples and categories that are listed on its home page.
OFFSTATS – The University of Auckland Library http://www.offstats.auckland.ac.nz/ This is a great starting point for locating official statistical sources by country, region or subject. All of the content in the database is assessed by humans for quality and authority, and is freely available.
Meltwater IceRocket http://www.icerocket.com/ IceRocket specialises in real-time search and was recommended for inclusion in the Top Tips for its blog search and advanced search options. There is also a Trends tool that shows you the frequency with which terms are mentioned in blogs over time and which enables you to compare several terms on the same graph.
Very useful for comparing, for example, mentions of products, companies, people in blogs.
Behind the Headlines – NHS Choices http://www.nhs.uk/news/Pages/NewsIndex.aspx Behind the Headlines provides an unbiased and evidence-based analysis of health stories that make the news. It is a good source of information for confirming or debunking the health/medical claims made by general news reporting services, including the BBC. For each “headline” it summarises in plain English the story, where it came from and who did the research, what kind of research it was, the results, the researchers’ interpretation, conclusions and whether the headline’s claims are justified.
A couple of weeks ago I wrote about the problems I was having with Google Verbatim (Google Verbatim on the way out?). This morning I ran through a checklist of commands that I am demonstrating in a webinar and it seems that Verbatim is back working as it should. Don’t hold your breath, though. Three times this year I have seen Google Verbatim disappear or do strange things and a couple of weeks later return to normal. Verbatim may be here to stay or it may not, but you cannot depend on many advanced search commands to always work as you expect. So either learn different ways of making Google treat your search in the way you require or use a different search engine.
Unfortunately, disappearing or unreliable functionality is not confined to just Google. Bing used to have a very useful proximity command that allowed you to specify how close you wanted your words to be to one another. The “near:n” operator is still listed in Bing’s list of advanced search commands and, although it seems to do something and reduce the number of results, it does not behave as described.
There is also an endangered list, including DuckDuckGo’s sort by date option. In fact all of DuckDuckGo’s web search options will probably soon change or disappear as it is currently powered by Yahoo!, which has been bought by Verizon. Who will DuckDuckGo turn to if Verizon does combine Yahoo with AOL, as has been stated in the press?
Get to know several different search tools really well and, for the ones that you use regularly, find out how they work and who provides the search results.
Update: 1st September 2016 – Verbatim seems now to be working as it should. I hope it stays that way but on three occasions this year I have seen it work one day, then not the next and then back to working again.
We have become accustomed to Google rewriting and messing about with our searches, and dropping search features that are infrequently used. The one option that could control most of Google’s creative interpretation of our queries was Verbatim but that now looks as though it could be destined for the axe as well.
A reminder of what Verbatim does. If you want to stop Google looking for variations on your terms, ignoring double quote marks around phrases, or dropping words from the search Verbatim is, or rather was, the quickest way to do it. If you are using a desktop or a laptop computer, run your search as normal. On the results page click on ‘Search Tools’ at the end of the line of options that appears at the top. Then, from the second line of options that should appear, choose ‘All results’ followed by Verbatim. The location of Verbatim on other devices varies.
Verbatim has been invaluable when searching on titles of research papers, legislation or researching topics for which you expect or want to retrieve very few or zero results. You might be researching rare adverse events associated with a pharmaceutical drug or wanting to confirm that what you are about to patent has not already been published and is out there for all to see. Or the topic is so specific that you only expect to see a handful of documents, if that. So, sometimes, no or a low number of results is a good thing. But Google does not like zero or small numbers of results and that is when Google’s search rewrite goes into overdrive.
I had noticed for a few months that Verbatim was not always working as expected but had hoped it was one of Google’s experiments. The problem has not gone away and the really confusing part is that Verbatim is still doing something but not what I would expect.
I was working in Penryn in July and took the opportunity to wander around the place. Inevitably, I googled some of the sites I had seen for further information but one threw up the Verbatim problem. I was particularly interested in what looked like a memorial but didn’t have time to seek out information on site. Looking at the photo afterwards I can see where the plaque was (to the right and next to the flagpole) but I missed it on the day.
I did see a sign on the wall surrounding the area, though, telling me that it was “two” on the Penryn Heritage Trail.
A quick, basic search told me that it is called the Memorial Garden but I wanted to find out more. I searched on Penryn memorial garden heritage trail.
This gave me 15,900 results but Google had decided to leave out Penryn so I was seeing plenty of information about heritage trails but they were not all in Penryn. I prefixed Penryn with intext: to force Google to include it in the search but then the word heritage was dropped. I applied Verbatim to the search without the intext: command.
This gave me 732 results but even though I had applied Verbatim Google had dropped ‘memorial’ from the search. I prefixed memorial with intext: and got 1230 results with little change to the top entries. And no, I have no idea why there are more hits for this more specific search. I can only assume that other terms were omitted but I was not seeing that in my top 50. I then did what I should have done right from the start and searched on Penryn, and “memorial garden” and “heritage trail” as phrases. When Verbatim was applied this came back with 22 results but no detailed information about the garden. I started to tweak the search terms a little more. Verbatim would drop one, I would ‘intext:’ them and they were then included but I began to suspect that I was being too specific. So I dropped “heritage trail” from the search and cleared Verbatim: 19,300 results with all of the top entries being relevant and informative.
This emphasises that it often pays to keep your search simple, and I mean really simple. Including too many terms, however relevant you may think they are, can be counter-productive. I would have realised earlier that my strategy was too complex had Verbatim behaved as I assumed it would and it had included all of my terms with no variations or omissions.
I ran a few of my test searches to see if this is now a regular feature. One was:
prevalence occupational asthma diagnosis agriculture UK
The results came back as follows:
Ordinary search – prevalence missing from some of the documents, 1,750,000 results
Verbatim search – diagnosis and agriculture missing from some of the documents, 15,300 results
Verbatim with quote marks around missing terms – same results as plain Verbatim with diagnosis and agriculture still missing
Verbatim search but prefixing missing terms with intext:, 14,200 results
I changed the search slightly to:
incidence occupational asthma diagnosis agriculture UK
Some of the results were:
Ordinary search – incidence and agriculture missing from some of the documents, 2,210,000 results
Verbatim search – incidence and agriculture missing, 15,500 results
Ordinary search on intext: incidence occupational asthma diagnosis intext:agriculture UK, 848,000 results
Verbatim intext:incidence occupational asthma diagnosis intext:agriculture UK, 15,000 results
I saw the same pattern with a few other searches. I also tested the searches in incognito mode, and both signed in and signed out of my Google account. There was very little difference in the results and Verbatim behaved in the same way.
It looks as though Verbatim still runs your search without any variations on your terms or synonyms but that it now sometimes chooses to omit terms from some of the documents. To keep those terms in the search you have to prefix them with intext:. Double quote marks around the words are sometimes ignored. This is an unnecessary change and defeats the object of having an option such as Verbatim.
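For repeated test searches the intext: workaround can be scripted. The helper below is purely my own illustration (it just builds a query string; it is not a Google feature or API): given a query and the terms Google reports as omitted, it prefixes each of those terms with intext: so Google is forced to keep them.

```python
def force_terms(query: str, missing: list[str]) -> str:
    """Prefix each term Google dropped with intext: so it stays in the search."""
    return " ".join(
        f"intext:{word}" if word in missing else word
        for word in query.split()
    )

print(force_terms("incidence occupational asthma diagnosis agriculture UK",
                  ["incidence", "agriculture"]))
# intext:incidence occupational asthma diagnosis intext:agriculture UK
```

The output matches the form of the intext: searches listed above, and the rewritten string can then be pasted back into Google with Verbatim switched on.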
More worrying, though, is that Google obviously thinks Verbatim needs “fixing”. But what it has done is to make the option more difficult to use, which in turn will result in people using it less often than they do already. And if Google sees that use is decreasing it will simply get rid of it altogether. Time to swot up on the few remaining Google commands, or use a different search tool.
It looks as though Google’s daterange: command really has gone for good. Over the last 6 months it has been a case of “now it works, now it doesn’t” but I’ve been testing it regularly over the past couple of months and it seems to have permanently stopped working. People have been reporting the problem in various forums since the start of this year.
So why bother using “daterange:” instead of the date/time option under Search tools? Because the latter does not work with Verbatim. It doesn’t happen often but there are occasions when I need Google to search using my terms exactly as I have typed them without any omissions or variations AND limit the search to a specified time period. The only way to do that was to first run the search with the daterange included in the string and then apply Verbatim to the results.
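For readers who never used it, daterange: took Julian day numbers rather than calendar dates, which is why most of us relied on a converter. A minimal sketch of the conversion (Python, my own illustration; the example query and dates are purely hypothetical):

```python
from datetime import date

def julian_day(d: date) -> int:
    # toordinal() counts days from 1 January of year 1; adding the offset
    # 1721425 gives the (noon) Julian day number that daterange: expected.
    return d.toordinal() + 1721425

start = julian_day(date(2016, 1, 1))    # 2457389
end = julian_day(date(2016, 6, 30))     # 2457570
query = f'penryn "memorial garden" daterange:{start}-{end}'
```

Running a string like that as the search, and then applying Verbatim to the results, was the combination described above.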
It is getting to the point where Google is totally useless for advanced, focussed research. What will be next for the chop? filetype? site? If you haven’t done so already, it is time to learn how to use the alternative search tools. Cue blatant plug for my September workshop with UKeiG: Essential non-Google search tools!
Two of the services I cover in my workshop for researchers on alternatives to Google are Carrot Search and eTools.ch, and recently one of the people who had attended the session in April asked me to confirm what Carrot Search uses to provide its main results. Strictly speaking, neither Carrot Search nor eTools is Google-free: eTools is a metasearch tool that has Google as one of its sources and Carrot Search uses eTools for its web search. At the start of the year, Carrot Search offered seven options for searching under tabs across the top of the search screen: Web, “wiki”, Bing, News, Images, PubMed and Jobs. Web search used eTools.ch to provide the results.
The range of options has now been reduced to just three: the more transparently labelled eTools Web Search, PubMed and Jobs.
This makes sense as the number of accesses to Bing via the API was always limited and I could never get the news or images options to work. eTools in any case is a metasearch engine covering 17 tools including Google, Bing and Wikipedia so the extra Carrot Search tabs did seem to be unnecessary. The full list can be seen on the eTools home page.
This is where it gets interesting. It appears that Carrot Search does not just copy the results from a search on eTools. I ran a search on Brexit in Carrot Search and compared the results from eTools Worldwide and eTools United Kingdom. All of the sets were different so Carrot Search must be doing some additional analysis and processing.
Carrot Search doesn’t just list the results but also organises them into topics or Folders that are displayed on the left hand side of the screen. These can be a useful way of narrowing down your search.
Carrot Search offers two other ways of displaying results: Circles and Foam Tree.
Both show the density of terms in the top 100 results and allow you to click on an area to add the term or phrase to the search. In addition I am finding that the Foam Tree is an interesting way of monitoring changes in news coverage and social media discussions on a topic, product or company. Yesterday, when I ran the search on Brexit, there was an area representing Theresa May. Today, that had been replaced with one for David Cameron. I assume that is because the news coverage has been concentrating on David Cameron’s last day as Prime Minister and his last Prime Minister’s Questions (PMQ) in Parliament. Later he goes to see the Queen to officially resign as Prime Minister. Tomorrow, with Theresa May as our new Prime Minister and a new Cabinet, the Foam Tree could have a very different structure so I shall be looking at it periodically to see if and how it reflects changes in events.
As I mentioned earlier eTools.ch, which is behind the main Carrot Search web search, is a metasearch engine covering 17 tools. It also has options to select a country from a drop-down list (Worldwide, Switzerland, Liechtenstein, Germany, Austria, France, Italy, Spain, UK) and a language (All, English, German, French, Italian, Spanish). Either or both of these give you completely different views and opinions on a subject.
It is a convenient way of gathering a range of foreign language information, especially on European events, and is easier than searching individual country versions of Google or Bing. The disadvantages are that the range of countries and languages is limited and many of the articles will not be in English. Nevertheless, I often find it helpful at the start of a piece of research as I get a general feel for the type and range of information that is available.
Carrot Search and eTools.ch are just two of the tools that I cover in my workshop on alternatives to Google. If you are interested in finding out more, the next session is being organised by UKeiG and will be held in London on Wednesday, 7th September 2016. Further details are available on the UKeiG website.
Some of the slides that I used as part of my June 2016 workshops on Business Information are now available on both SlideShare and authorSTREAM. The workshop run in the last week of June inevitably included a session on the EU referendum and the Brexit result. A few of those extra slides are included in this edited version of the presentation.
If you have attended one of my recent search workshops, or glanced through the slides, you will have noticed that I have a new test query: the height of Ben Nevis. It didn’t start out as a test search but as a genuine query from me. A straightforward search, I thought, even for Google.
I typed in the query ‘height of ben nevis’ and across the top of the screen Google emblazoned the answer: 1345 metres. That sort of rang a bell and sounded about right, but as with many of Google’s Quick Answers there was no source and I do like to double or even triple check anything that Google comes up with.
To the right of the screen was a Google Knowledge Graph with an extract from Wikipedia telling me that Ben Nevis stands at not 1345 but 1346 metres above sea level. Additional information below that says the mountain has an elevation of 1345 metres and a prominence of 1344 metres (no sources given). I now have three different heights – and what is ‘prominence’?
After a little more research I discovered that prominence is not the same as elevation, but I shall leave you to investigate that for yourselves if you are interested. The main issue for me was that Google was giving me at least three slightly different answers for the height of Ben Nevis, so it was time to read some of the results in full.
Before I got around to clicking on the first of the two articles at the top of the results, alarm bells started ringing. One of the metres to feet conversions in the snippets did not look right.
So I ran my own conversions for both sets of metres to feet and in the other direction (feet to metres):
1344m = 4409.449ft, rounded down to 4409ft
4406ft = 1342.949m, rounded up to 1343m
1346m = 4416.01ft, rounded down to 4416ft
4414ft = 1345.387m, rounded down to 1345m
As if finding three different heights was not bad enough, it seems that the contributors to the top two articles are incapable of carrying out simple ft/m conversions, but I suspect that a rounding up and rounding down of the figures before the calculations were carried out is the cause of the discrepancies.
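A few lines of Python (my own sketch, using the exact definition 1 ft = 0.3048 m) show how rounding before the conversion produces exactly this kind of drift:

```python
FT_PER_M = 1 / 0.3048            # exact: one foot is defined as 0.3048 m

height_m = 1344.527              # the GPS-measured height of Ben Nevis

# Convert first, round last: the correct order
correct_ft = round(height_m * FT_PER_M)                  # 4411

# Round to whole metres first, then convert: two feet adrift
rounded_first_ft = round(round(height_m) * FT_PER_M)     # 4413
```

One premature rounding puts the figure two feet out; chain a couple of such conversions, as the article contributors appear to have done, and the numbers soon stop agreeing with each other.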
The above results came from a search on Google.co.uk. Google.com gave me similar results but with a Quick Answer in feet, not metres.
We still do not have a reliable answer regarding the height of Ben Nevis.
Three articles below the top two results were from BBC News, The Guardian and Ordnance Survey – the most relevant and authoritative for this query – and were about the height of Ben Nevis having been remeasured earlier this year using GPS. The height on the existing Ordnance Survey maps had been given as 1344m but the more accurate GPS measurements came out at 1344.527m or 4411ft 2in. The original Ordnance Survey article explains that this is only a few centimetres different from the earlier 1949 assessment but it means that the final number has had to be rounded up rather than down. The official height on OS maps has therefore been increased from 1344m to 1345m. So Google’s Quick Answer at the top of the results page was indeed correct.
Why make a fuss about what are, after all, relatively small variations in the figures? Because there is one official height for the mountain and one of the three figures that Google was giving me (1346m) was neither the current nor the previous height. Looking at the commentary behind the Wikipedia article, which gave 1346m, it seems that the contributors were trying to reconcile the height in metres with the height in feet but carrying out the conversion using rounded up or rounded down figures. As one of my science teachers taught me long ago, you should always carry forward to the next stage of your calculations as many figures after the decimal point as possible. Only when you get to the end do you round up or down, if it is appropriate to do so. And imagine if your Pub Quiz team lost the local championship because you had correctly answered 1345m to this question but the MC had 1346m down as the correct figure? There’d be a riot if not all out war!
That’s what Google gave us. How did Bing fare?
The US and UK versions of Bing gave results that looked very similar to Google’s but with two different quick answers in feet, and neither gave sources.
I won’t bore you with all of the other search tools that I tried except for Wolfram Alpha. This gave me 1343 meters or 4406 ft. At least the conversion is correct but there is no direct information on where the data has been taken from.
The sources link was of no help whatsoever and referred me to the home pages of the sites and not the Ben Nevis specific data. On some of the sites, when I did find the Ben Nevis pages, the figures were different from those shown by Wolfram Alpha so I have no idea how Wolfram arrived at 1343 meters.
So, the answer to my question “How high is Ben Nevis?” is 1344.527m rounded up on OS maps to 1345m.
And the main lessons from this exercise are:
Never trust the quick answers or knowledge graphs from any of the search engines, especially if no source is given. But you knew that anyway, didn’t you?
If you are seeing even small variations in the figures, and there are calculations or conversions involved, double check them yourself.
Don’t skim read the results and use information highlighted in the snippets – read the full articles and from more than one source.
Make sure that the articles you use are not just copying what others have said.
One of the services I provide is company research including official registry documents and accounts. Many registries, including the UK’s Companies House, make a significant amount of their data available free of charge. Some still charge for documents and a few insist that you register before you can even search for a company. If I know the information is freely available I usually point the client at the relevant website but a few people come back to me when they discover that the interface is in a foreign language. If registration and/or payment are required I’m often asked to search on the client’s behalf because they just do not want the hassle of going through the registration process and recouping the cost of a small overseas transaction from their accounts department.
Regardless of whether the information is free or charged for, I often receive what I call a second stage request for more detailed accounts. Why is there no Profit & Loss? Where is the revenue/turnover figure? I then have to explain how the reporting and filing requirements differ depending on the country, or even state; and then I have the joy of taking the client through small company exemptions. Some people I know have only just got their head around the changes introduced by UK Companies Act 2006. I now have to tell them that this has changed yet again.
In March 2015 the UK Government approved new regulations that implement the requirements of the new EU Accounting Directive. The changes came into effect in the UK from 1 January 2016. There are a number of changes, which may reduce yet further the amount of information that small companies are required to provide, and there are also changes to what is deemed to be a “small” company. Small companies can now be bigger.
A company can now qualify as small if it meets at least two of the three following criteria:
turnover not more than £10.2m (previously £6.5m)
balance sheet total not more than £5.1m (previously £3.26m)
average number of employees not more than 50 (no change)
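The “at least two of three” test is easy to get wrong when skimming accounts, so here it is spelled out as a sketch (my own illustration of the thresholds above, not official Companies House logic):

```python
def qualifies_as_small(turnover: float, balance_sheet_total: float,
                       avg_employees: int) -> bool:
    """Return True if at least two of the three small-company criteria are met."""
    criteria_met = [
        turnover <= 10_200_000,            # not more than £10.2m
        balance_sheet_total <= 5_100_000,  # not more than £5.1m
        avg_employees <= 50,               # not more than 50
    ]
    return sum(criteria_met) >= 2

# A company can exceed one threshold and still qualify as small:
print(qualifies_as_small(12_000_000, 4_000_000, 45))   # True
print(qualifies_as_small(12_000_000, 6_000_000, 45))   # False
```

Note that a company over the turnover limit can still be “small” provided it stays within the other two thresholds.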
Companies House explains one of the other changes: “… the removal of the ability for a small or medium-sized company to file abbreviated accounts with us at Companies House. A company will now be required to file the accounts they prepare for their members at Companies House (although a small company or micro-entity will usually be able to choose not to file their profit and loss account or director’s report).”
“However, this does not mean that all small companies are now required to file full accounts, the very smallest companies may disclose less information by preparing micro-entity accounts. Other small companies may, instead of filing full accounts, choose to prepare a set of abridged accounts for their members and then file these with us.”
So, as well as “small” being allowed to be bigger we now have even smaller companies or “micro-entities” who can choose to disclose less information. The whole thing is beginning to look as clear as mud!