Friday, November 21, 2014

Does Your Database Need A Law Degree?

While contemplating the major legal database algorithms recently, I realized that the current generation of searchers might rely too heavily on the results without really understanding how those results are generated. 

There may come a point when these algorithms incorporate enough artificial intelligence (AI) to justify that kind of reliance, especially because algorithms tend to get stronger over time by drawing on signals such as external citations and click counts to generate results.

But we are not there yet. With a current natural language search, for example, the database may substitute synonyms for a particular keyword. If that keyword is a term of art for which no synonym should be used, this "helpful" function of the database becomes a hindrance that returns irrelevant results.
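To make the point concrete, here is a toy sketch of naive synonym expansion; the synonym table and documents are made up for illustration and are not any vendor's actual algorithm. Expanding the contract-law term of art "consideration" into everyday synonyms pulls in a document that has nothing to do with contract law:

```python
# A toy sketch (hypothetical synonym table and documents) of how naive synonym
# expansion can dilute a search built around a term of art.

SYNONYMS = {"consideration": ["thoughtfulness", "kindness"]}  # hypothetical mapping

DOCUMENTS = {
    "contract_case": "the court weighed the adequacy of consideration in the contract",
    "etiquette_note": "a reminder about thoughtfulness and kindness toward colleagues",
}

def expand(terms):
    """Add every listed synonym for each query term."""
    expanded = set(terms)
    for term in terms:
        expanded.update(SYNONYMS.get(term, []))
    return expanded

def search(terms):
    """Return the names of documents containing any query term or synonym."""
    expanded = expand(terms)
    return [name for name, text in DOCUMENTS.items()
            if any(term in text for term in expanded)]

print(search(["consideration"]))
# ['contract_case', 'etiquette_note'] -- the etiquette note is irrelevant noise
```

A researcher who typed "consideration" wanted the contract case; the expansion quietly changed the question being asked.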

This is good news for lawyers and librarians: the algorithms do not have the AI needed to interpret case law, evaluate it, and contextualize it, which means that librarians and lawyers will be needed well into the future.

That is, unless our databases get the AI equivalent of a law degree. There is a new book out called "The Formula" by Luke Dormehl that "tackles the rise of algorithms and artificial intelligence in art, politics, online relationships, and the law."

The book provides a high-level view of how algorithms are changing our world. For lawyers, there are obvious political and legal implications that will need to be hammered out in the coming years.
As Dormehl notes, we need to educate the public on what exactly algorithms are and how they work, which is one of the major new roles of the librarian.

Ultimately, Dormehl went on to discuss whether computers could replace lawyers and judges. He said that "if you were able to build a conceptual model of Judge Posner that would be 99 percent accurate in forecasting how he would decide a certain case, you could rely on that to decide cases rather than the person himself. But we're not there yet—and perhaps we never will be. As a nonlawyer, one of the great realizations I had while writing “The Formula” was the degree to which laws are not static entities that can easily be automated. The judicial process is less about a kind of mechanical objectivity than it is a high level of subjective agreement. It takes a human to resolve multiple parties' grievances, and to reconcile different interpretations of laws that are often written in such a way that their meaning can be argued. Machines can't do that yet."

Tuesday, November 18, 2014

Convenient Searching In Library Catalogs

An ongoing concern for librarians is the ease with which a user can search a library catalog. If the search is too convoluted, researchers will generally resort to resources they are comfortable using, such as Google and Wikipedia, and forgo the library catalog and other library resources.

To that end, "Yale Library Information Technology will beta test a new search system called Quicksearch, slated to replace the current Orbis catalog in September 2015. According to library administrators, the new platform will streamline the different search engines available across schools into a unified Yale library site. Library administrators said they hope the new system will resolve inconveniences in searching for texts. 'The library always had a multitude of systems that don’t necessarily talk to each other,' Chief Technology Officer for Yale Libraries Michael Dula said. 'We are aiming to pull the resources available at Yale under one umbrella, starting with main library catalogue and law library catalogue.'"

This is a wonderful improvement and makes a search in the library catalog much more convenient. The new platform will have a search window that "will appear as a split-screen, in which one column will display the merged catalog results and the other column will show academic articles accessible and licensed through Yale." This type of federated searching lets researchers see all of the resources available to them and alleviates the need to search separate catalogs or separate databases individually.
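Conceptually, federated search fans a single query out to each underlying system and presents the merged results together. Here is a minimal sketch, with hypothetical search_catalog and search_articles functions standing in for the separate systems (a real implementation would call each system's own API):

```python
# A minimal sketch of federated searching. The two search functions below are
# hypothetical placeholders for separate systems (a catalog and an article index).
from concurrent.futures import ThreadPoolExecutor

def search_catalog(query):
    # Placeholder for the library catalog search.
    return [{"source": "catalog", "title": f"A treatise on {query}"}]

def search_articles(query):
    # Placeholder for the licensed article index.
    return [{"source": "articles", "title": f"A recent article on {query}"}]

def federated_search(query):
    # Fan the query out to both systems in parallel, then return the two
    # result sets side by side -- much like a split-screen display.
    with ThreadPoolExecutor() as pool:
        catalog = pool.submit(search_catalog, query)
        articles = pool.submit(search_articles, query)
        return {"catalog": catalog.result(), "articles": articles.result()}

print(federated_search("habeas corpus"))
```

The design choice that matters is running the searches concurrently, so the unified interface is no slower than the slowest individual system.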

Yale noted that it used code developed by Columbia University's library to create the database, but the baseline code was specific to Columbia and had to be revised to work for Yale.

One Yale law student said that since he does not use the library catalog to search for law documents, he's not sure the change will impact him. This brings to light the fact that libraries and librarians not only need to create these types of catalogs, they also need to go a step further and promote their use. Once students understand that the library catalog generally surfaces vetted, reputable information automatically, they may see the importance of using it as a resource.

Monday, November 17, 2014

Decline In Bar Exam Scores Causes A Stir

We're seeing a national decline in bar exam scores, and it's causing a stir. The WSJ Law Blog reported that "[t]he overall passage rate for the Texas exam given in July, for example, was 11 percentage points lower than last year’s results. Idaho, Iowa, Oregon and Washington were among other states reporting sharp drops." And we've already seen drops in other states.

In response to the lower scores, the president of the National Conference of Bar Examiners sent out a memo addressed to law school deans across the country that defended the integrity of the group’s exam and raised concerns about the ability of the would-be lawyers who took it. "The NCBE is a national Wisconsin-based non-profit that prepares widely used standardized portions of the bar exam, including the Multistate Bar Examination, a multiple choice test that typically counts for half of a test-taker’s score."

The memo stated in part, “[w]hile we always take quality control of MBE scoring very seriously, we redoubled our efforts to satisfy ourselves that no error occurred in scoring the examination or in equating the test with its predecessors. The results are correct. . . All point to the fact that the group that sat in July 2014 was less able than the group that sat in July 2013."

One dean at Brooklyn Law School fired back on Monday with a letter to the NCBE. The dean "said he found the assertions unconvincing and demanded a 'thorough investigation of the administration and scoring' of the July 2014 exam." The dean disagreed that the group that sat in July 2014 was less capable than the group that sat in July 2013 because "the median LSAT score for the 2013 and 2014 cohorts was 163 in both cases." But "the passage rate for Brooklyn Law School graduates who took the bar for the first time in July [2014] was nearly 10 percentage points lower than last year’s rate."

With scores dropping all over the country, it is a good idea for the NCBE to look at the credentials of the test-takers to determine if there was a significant drop in their capability. If not, the test itself should be called into question.

Friday, November 14, 2014

The LSAT & The Upward Trend In Transfers

The National Jurist reported that the number of LSAT takers has hit a record low.

LSAC reported that "the number of students who took the October LSAT dropped to 30,943, down from 60,746 in 2009, when it hit a high mark. The numbers are off 8.1 percent from last year. It was the lowest number of test-takers in October since prior to 1987. The July exam was also off from the prior year, dropping from 23,997 to 21,803. The number of test-takers is on pace to be the lowest number since prior to 1987, when there were fewer law schools."

As law schools compete for fewer students, it may be easier than ever to get into law school even with a less-than-stellar LSAT score.

For prospective students who do score low on the LSAT, say in the high 130s, there is a good chance of starting law school at a lower-ranked school and transferring to a higher-ranked school after the first year.

The ABA Journal reported "as law school enrollment declines nationwide, competition for transfer students is growing. Transfer students can benefit law schools, which increase revenues by accepting students whose LSAT scores don’t impact rankings by U.S. News & World Report. And transfer students benefit when they 'trade up' to a higher-ranked school after spending an initial year at a school that may be less expensive."

If it is your dream to attend law school, now may be the perfect time to play the odds and get into a school that may not have admitted you in years prior or to take advantage of the increase in transfers.


Thursday, November 13, 2014

July 2014 Michigan Bar Exam Results

Last week, the Michigan Board of Law Examiners released the seat numbers of the most recent Michigan bar exam passers.

The passage rate was 73% for first time takers with 563 people passing on their first try. The total passage rate for all applicants (including retakers) was 63% with 604 total passers.
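As a rough back-of-the-envelope check (my arithmetic, not figures reported by the Board, and the published rates are rounded), those numbers imply roughly how many people sat for the exam and how repeat takers fared:

```python
# Implied taker counts derived from the reported pass rates and passer counts.
# The published rates are rounded, so these figures are approximations.
first_time_passers, first_time_rate = 563, 0.73
total_passers, total_rate = 604, 0.63

first_time_takers = first_time_passers / first_time_rate    # about 771
total_takers = total_passers / total_rate                    # about 959

repeat_takers = total_takers - first_time_takers             # about 188
repeat_passers = total_passers - first_time_passers          # 41

print(round(first_time_takers), round(total_takers),
      round(repeat_passers / repeat_takers, 2))               # 771 959 0.22
```

In other words, only about one in five repeat takers passed.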

The breakdown for Michigan law schools for all takers is as follows:

Thomas M. Cooley:  44 percent passed, 56 percent failed. (140 passed)

Michigan State University: 80 percent passed, 20 percent failed. (123 passed)

University of Detroit Mercy: 55 percent passed, 45 percent failed. (68 passed)

University of Michigan: 87 percent passed, 13 percent failed. (33 passed)

Wayne State University: 74 percent passed, 26 percent failed. (115 passed)

University of Toledo: 79 percent passed, 21 percent failed. (11 passed)

Others: 75 percent passed, 25 percent failed. (114 passed)

These results are pre-appeal. It looks like the Michigan Board of Law Examiners has figured out a way to keep the overall passage rate in the 60% range.

Congrats to all who passed. For more information on what to do after passing, check out the State Bar of Michigan's website.

Wednesday, November 12, 2014

Inter-American Court of Human Rights Database

Those interested in human rights legal research should be aware of the Inter-American Court of Human Rights Database.

The Organization of American States established the Inter-American Court of Human Rights in 1979 to enforce and interpret the provisions of the American Convention on Human Rights. Its two main functions are thus adjudicatory and advisory. Under the former, it hears and rules on the specific cases of human rights violations referred to it. Under the latter, it issues opinions on matters of legal interpretation brought to its attention by other OAS bodies or member states.

The Inter-American Court of Human Rights (IACHR) Project of the Loyola of Los Angeles International and Comparative Law Review has released its Inter-American Court of Human Rights Database. This freely-available database produced by the editors and staff of the IACHR Project under the supervision of Professor Cesare Romano allows users to search Inter-American Court decisions by case name, country, and topic. Advanced search features include the ability to search by specific violation of various Inter-American Conventions.

Search results include a brief description of the case, information on judges, and violations found by the Inter-American Court. When available, the database includes a link to a detailed case summary which includes case facts, procedural history, merits, and state compliance with the Inter-American Court's judgment. To date, 74 detailed case summaries are available.

The IACHR Project welcomes comments and suggestions and can be reached at iachrproject@lls.edu.

Monday, November 10, 2014

Tips For Archival Research

It's great to have an understanding of the tools of the trade that make research more efficient. Gradhacker recently posted six tools for more efficient archival research.

It's important to do archival research efficiently because "grant budgets can only go so far," and so it is "important to make use of every minute when visiting faraway libraries and repositories, capturing as much information as possible."

The six tools for more efficient archival research are:

1. Apps for managing finding aids: Store a copy of the finding aid for each collection in GoodReader on an iPad in PDF format. GoodReader allows for annotating and highlighting, noting material that the researcher would like to find in a particular resource. Then use trusty old Google Drive to make a master plan and stay on task.

2. Camera: A DSLR or iPhone depending on resolution needs.

3. Wireless SD card: Eyefi, for example, makes an SD card that fits right into your camera and then, via wifi, automatically uploads your images to your phone (and even the web) as you shoot them. The point is to make your camera operate more like your smartphone, making it easy to share images instead of relying on cords to upload them to your computer.

4. Table grip: Invest in a table mount for your camera.

5. Remote control: In order to use the table grip efficiently, you can rely on a simple remote control.

6. Scanner apps: ProfHacker featured a post on digital workflow in the archives last year, and the author recommended using Turboscan, an iPhone app that allows you to convert images to PDFs.

It's helpful to learn from another researcher's efficient archival workflow to understand what works well.

Friday, November 7, 2014

The Perils of Focusing On ROI

Over at InsideHigherEd, Library Babel Fish (aka Barbara Fister) posted an interesting 'stream of consciousness' article discussing the perils of assessing return on investment in libraries.

Fister started thinking more about ROI because she was alerted to a new position for a "librarian whose role would be assessment and marketing. The library is seeking a librarian who is 'interested in using the results of library assessment to promote the value of the library to the university as part of our strategic communications program.' The audience is the higher administrators who probably don't use the library but hold the purse strings."

As Fister notes, there is a conflict of interest when doing assessments. "When we do assessment, are we honestly trying to find out what is going on with our students so that we can figure out what practices improve or inhibit learning? Or are we simply trying to demonstrate a good return on investment?"

In a perfect world, libraries would do assessments to determine the library's value to the students. In other words, we would determine whether our collections, instruction, and other student interactions are effective, and if not, we would take our findings and try new things to reach our audience. These assessments would not serve as evidence in arguments over whether the library should lose funding or be shut down. After all, most academics can admit that the library is at the heart of all higher education.

But we find ourselves in a time when administrators are interested in return on investment and in assessment to determine if libraries are still needed. The library is often a huge expenditure for a university, and rightly so, and it's a tempting place to tighten the purse strings.

The problem with assessing for market value is that we often lose the opportunity to assess whether we are getting it right, because "[i]f participants thought they were being judged, they would be less inclined to take risks or admit that they see room for improvement." And this stifles the ability to ask honest questions about the effectiveness of the library.

As Fister points out, "[i]t seems [that] there needs to be a line drawn between honestly assessing how well various campus units fulfill their missions and making a case for their very existence to upper administration. When assessment becomes market research - or evidence that we shouldn't be shut down - I think we're losing the opportunity to ask honest questions."

Tuesday, November 4, 2014

Is An Open Source Bluebook On The Way?

Is it true? Is The Bluebook really in the public domain? 

The Lawyerist reported that "Harvard Law Review who has been aggressively protecting the copyright of the Bluebook against all those who would let legal citation free into the wild forgot to renew the copyright on the 10th Edition."

Professor Christopher Jon Sprigman from New York University School of Law sent a letter stating that his client, Public Resource, intends to publish an electronic version of the 10th edition in light of its public domain status. 

In addition, Professor Sprigman calls the copyright protection of the 19th edition into question. "[N]umerous courts have mandated use of The Bluebook. As a consequence, The Bluebook has been adopted as an edict of government and its contents are in the public domain. But even if we lay that point aside (which, of course, we would not), very little of the 19th edition can be construed as material protected by copyright. Many portions of the 19th edition are identical to or only trivially dissimilar from public domain material contained in the 10th edition. Other portions of the 19th edition are comprised either of material entirely outside the scope of copyright, or material which merges with the system of citation that The Bluebook represents. These portions of the 19th edition are likewise available for public use."

The new public edition project is called Baby Blue. The "project will mix public domain portions of the 19th edition with newly-created material that implements The Bluebook’s system of citation in a fully usable form. In short, The Bluebook will soon face a public domain competitor. And when Baby Blue comes to market, The Harvard Law Review Association is likely to face questions regarding why the public – including pro se and indigent litigants – are obliged to pay for access to a resource that is indispensable to all those who seek justice from our courts."

The letter is an attempt to inform the Harvard Law Review Association of Public Resource's intentions, with a final plea for the Association to "recognize the important place our legal citation system plays in our system of democracy and not stand in [the] way."