
The Problem with “Google-Research”


If you have a question, chances are the internet has answers: research these days tends to start with plugging a question into Google, browsing the results on the first (and, if you’re really desperate, second) page, and going from there. If you’ve found a source that you trust, you might go to the relevant site and call it a day; if you’re more dedicated, you might double-check your source against others from your search results, just to make sure they say the same thing. This is not the most robust kind of research – that might involve cracking a book or talking to an expert – but we often consider it good enough. Call this kind of research Google-research.

Consider an example of Google-researching in action. When doing research for my previous article – Permalancing and What it Means for Work – I needed to get a sense of the state of freelancing in America. Some quick Googling turned up a bunch of results, the following being a representative sample:

‘Permalancing’ Is The New Self-Employment Trend You’ll Be Seeing Everywhere

More Millennials want freelance careers instead of working full-time

Freelance Economy Continues to Roar

Majority of U.S. Workers Will be Freelancers by 2027, Report Says

New 5th Annual “Freelancing in America” Study Finds That the U.S. Freelance Workforce, Now 56.7 Million People, Grew 3.7 Million Since 2014

While not everyone’s Googling will return exactly the same results, you’ll probably be presented with a similar set of headlines if you search for the terms “freelance” and “America”. The picture painted by my results is one in which freelance work in America is booming, and welcomed: not only do “more millennials want freelance careers,” but the freelance economy is currently “roaring,” having grown by millions of people over the course of only a few years. If I were simply curious about the state of freelancing in America, or if I were satisfied with the widespread agreement among my results, then I would probably have been happy to accept the verdict of my Google-researching: that freelancing in America is not only healthy but thriving. I could, of course, have gone the extra mile and tried to consult an expert (perhaps I could have found an economist at my university to talk to). But I had stuff to do, and deadlines to meet, so it was tempting to take these results at face value.

While Google-researching has become the go-to way of answering questions (whenever I ask my students how they would figure out the answer to basically any question, for example, their first response is invariably that they Google it), there are a number of ways that it can lead one astray.

Consider my freelancing example again: while the above headlines generally agree with each other, there are reasons to worry about whether they convey information that’s actually true. One problem is that all of the above articles summarize the results of the same study: the “Freelancing in America” study, mentioned explicitly in the last headline. A little more investigating reveals some troubling information about that study: in addition to concerns I raised in my previous article – including that it glosses over disparities in freelance incomes, and fails to distinguish between the earning potentials and the number of available jobs across different types of freelance work – the study itself was commissioned by the website Upwork, which describes itself as a “global freelancing platform where businesses and independent professionals connect and collaborate.” Such a site, one would think, has a vested interest in presenting the state of freelancing as positively as possible, and so we should at the very least take the results of the study with a grain of salt. The articles, however, merely relay information from the study, doing little in the way of quality control.

One worry, then, is that by merely Google-researching the issue I can end up feeling overly confident that the information presented in my search results is true: not only is the information I’m reading presented uncritically as fact, but all my search results agree with and support one another. Part of the problem lies, of course, with the presentation of the information in the first place: while the study’s results warranted a grain of salt, the various websites and news outlets that wrote the above articles took those results at face value. As a result, although it was almost certainly not the intention of their authors, the articles end up presenting misleading information.

The phenomenon of journalists reporting on studies by taking them at face value is unfortunately commonplace in many different areas of reporting. For example, writing on problems with science journalism, philosopher Carrie Figdor argues that since “many journalists take, or frequently have no choice but to take, a stance toward science characteristic of a member of a lay community,” they do not possess the relevant skills required to determine whether the information they’re presenting is true, and cannot reliably distinguish between those studies that are worth reporting on and those that are not. This, Figdor argues, does not necessarily absolve journalists of blame, as they are at least partially responsible for choosing which studies to report on: if they choose to report on a field that is not producing reliable research, then they should “not [cover] the affected fields until researchers get their act together.”

So it seems that there are at least two major concerns with Google-research. The first relates to the way that information is presented by journalists – often lacking the specialized background that would help them vet the information they’re reporting on, journalists may end up presenting information that is inaccurate or misleading. The second is with the method itself – while it may sometimes be good enough to do a quick Google and believe what the headlines say, oftentimes getting at the truth of an issue requires going beyond the headlines.