A few weeks ago, I gave a lecture in our Advanced Legal Research course on free and low-cost legal research. This is not a new lecture topic for me. Typically, we focus on Fastcase and Casemaker for the low-cost resources, and Justia, FindLaw, Google Scholar, and government websites (among others) for the free resources. Recently, however, a number of legal research startups have come on the market that are attempting to change traditional legal research in some way. Ravel Law, for instance, approaches legal research through a data visualization lens. What I have found particularly interesting, however, is the trend toward crowdsourcing legal research.
“Crowdsourcing is the process of obtaining needed services, ideas, or content by soliciting contributions from a large group of people, and especially from an online community, rather than from traditional employees or suppliers.” I took this definition from Wikipedia, which seems fitting given that Wikipedia is probably the best-known example of crowdsourcing. The idea behind crowdsourcing legal research is admirable – share the wealth, lighten the burden, collaborate. Crowdsourced legal research can take many forms. For example, while these legal research platforms tend to offer the basic, unannotated primary sources to start with, users might be encouraged to provide annotations to an unannotated case or code section in the form of related cases or secondary sources, as well as relevant legal forms, white papers, etc.
With that in mind, I showed the class Casetext as an example of a crowdsourced legal research platform, demonstrated how the crowdsourcing works, and explained the impetus behind such a platform.* Because legal research is a difficult task to master, I expected the students to be thrilled by the idea of a platform that offers helpful resources previously relied upon by legal practitioners researching the same topic; instead, they were very skeptical. They had significant ethical concerns about relying on another attorney’s work product instead of finding the resources themselves. One student even suggested that a disgruntled attorney might post privileged materials on the website, hurting both the firm and the client. Of course, such material can be taken down, but as we know, once something is on the internet, rarely does it ever truly disappear.
The class reaction to the notion of crowdsourcing legal research got me thinking about the pros and cons of such platforms. To begin with, it should be stressed that these platforms are not suggesting that you should slough off your research by relying entirely on the work of others; rather, they promote collaboration and sharing of resources to aid legal research. The beauty of crowdsourcing is that “[l]arge groups can effectively and accurately solve some tasks better than individuals,” (Wolfson & Lease, 1). Two people researching the same issue will likely come up with many overlapping resources, but are also likely to find a few materials the other did not. Wouldn’t legal research be that much easier if we had access to materials that others had found while researching this same issue? Further research into crowdsourcing resulted in a short list of pros and cons:
- Facilitation of legal research tasks, as stated earlier.
- Reputation-building – Legal research platforms such as Casetext and Jurify** encourage legal research contribution by rewarding contributors of high-quality content with public recognition on the platform (such as through a Top Attorneys list).
- Portfolio-building – Active participation in these platforms can be a tool to reference when on the job market. Your participation might show a potential employer how invested you are in the professional conversation on a particular area of law.
- Quality of contribution – A common issue with crowdsourcing is that anyone can do it, so it can be difficult to know how much reliance to place on any information posted by the “crowd.” These platforms have a system to combat faulty information: members of the community are asked to up-vote or down-vote contributed content based on how reliable or relevant they find it.
- Data security – These platforms have little control over what users post, but, as my students pointed out, firms and clients may have very real concerns about what contributing attorneys might decide to post on these sites. This issue could arise, as stated earlier, with a disgruntled attorney, but scholars in this area have also noted an ethics-blindness that can occur when users get over-zealous about contributing to the crowd for the greater good without thinking of reasons that it may be imprudent to contribute certain content (Dolmaya, 101-102).
- Risks of anonymity – As we have seen with the rise in cyber-bullying and related issues, individuals often behave quite differently when their actions are anonymous. This can lead to unethical behavior. In one line of research, surveyed crowdsourcing workers identified specific online activities as unethical, yet when asked to perform those same activities in an anonymous crowdsourcing environment, many performed the very activities they had earlier deemed unethical (Harris & Srinivasan). Fortunately, in the case of these legal research platforms, users are never truly anonymous, so we can expect most participants to refrain from outlandish behavior.
Many scholars have offered good rules of thumb for the use of crowdsourced information. It is best to treat any crowdsourced content as unreliable until it has been verified, either by you or by others in the community (Levin). Your level of reliance on crowdsourced content matters as well. It is not wise to rely on crowdsourced content as a bright-line rule in your research, but using it as a starting point (leading to more tried-and-true resources) could be a good strategy (Frankrone, 904). In fact, that was the ultimate conclusion of the Advanced Legal Research class: crowdsourced legal research could be a helpful place to begin your research, but in the end, the person responsible for the veracity and completeness of your research is you.
* Other platforms that tend toward crowdsourcing include Jurify and Mootus (the latter is more for building legal argument skills than legal research).
**Note: Jurify recently announced that it will be shutting down soon due to lack of funding.
Julie McDonough Dolmaya, The Ethics of Crowdsourcing, 10 Linguistica Antverpiensia, 97-111 (2011), available at https://lans-tts.uantwerpen.be/index.php/LANS-TTS/article/view/279/177.
Erin R. Frankrone, Free Agents: Should Crowdsourcing Lead to Agency Liability in Firms, 15 Vand. J. Ent. & Tech. L. 883-911 (2013), available at http://www.jetlaw.org/wp-content/uploads/2013/05/Frankrone.pdf.
Christopher G. Harris & Padmini Srinivasan, Crowdsourcing and Ethics: The Employment of Crowdsourcing Workers for Tasks that Violate Privacy and Ethics, in Security and Privacy in Social Networks (Y. Altshuler et al. eds. 2013).
John Levin, ETHICS 20/20: Third Party Services – Part Two: Crowdsourcing, CBA Record 60 (May 25, 2011).
Stephen M. Wolfson & Matthew Lease, Look Before You Leap: Legal Pitfalls of Crowdsourcing, Conference Proceeding, ASIST (2011), available at https://www.ischool.utexas.edu/~ml/papers/wolfson-asist11.pdf.