Wednesday, April 13, 2016

Crowdsourcing: Weather on the Oceans and Shells on the Beaches

 The collections of libraries, archives, and museums continually expand while, at the same time, budgets and staff contract. Objects and documents are marked for digitization but languish until an intern can be assigned the task. Other objectives, such as transcription, beckon from the far reaches of archival storage but remain in the background as staff tends to more pressing needs. The magic of the internet and the number of people who use it seem to offer a solution in crowdsourcing. But it is not quite as magical as all that: obstacles abound in the attitudes of curators, librarians, and archivists; in the trepidation of the populace; and in the actual mechanics of getting the job done. If the obstacles can be overcome, that is, if the expert staff is willing to accept the assistance of the possibly inexpert populace and the task is presented in such a way as to attract helpful participants and enable them to perform it, the work may get done. At the very least, it will be begun.
The first project I chose is a joint venture between the National Archives and Records Administration (NARA) and the National Oceanic and Atmospheric Administration (NOAA) that involves transcribing ships’ logs for weather information. The goal is not only to understand past weather patterns but also, by adding to that knowledge, to gain insight into forecasting, especially where climate change is concerned. NARA is involved as befits its status as the nation’s record keeper; it provided Navy, Coast Guard, and Revenue Cutter logs for the project.[1] The dates run from the mid-19th century through World War II. NOAA is involved because understanding weather is a large part of its work.
The NARA mission statement reads as follows:
 Our Mission is to provide public access to Federal Government records in our custody and control. Public access to government records strengthens democracy by allowing Americans to claim their rights of citizenship, hold their government accountable, and understand their history so they can participate more effectively in their government.[2]
Clearly, these ships’ logs are government documents (the ships they document were in government service), so, in keeping with NARA’s mission, the transcription project makes them more accessible to the public.
NOAA’s mission statement consists of these points:
1. To understand and predict changes in climate, weather, oceans and coasts;
2. To share that knowledge and information with others; and
3. To conserve and manage coastal and marine ecosystems and resources.[3]
The goal of this transcription project, to build better knowledge of past weather patterns and improve forecasting of future ones, is completely aligned with that mission. Furthermore, these ships’ logs provide weather reports that are not available from any other source. Even if weather records were being kept in this country at the time of these voyages, they would not provide information for areas other than the United States. The ships’ logs contain observations from the world’s oceans, which undoubtedly provide a broader picture of climate patterns across the globe.
           
From the Citizen Archivist page on the NARA website I followed a link to Oldweather.org, where I had a choice between Old Weather Whaling and Old Weather Classic. I wasn’t sure which one held greater appeal, so I first picked Classic to see what it was about. I was happy to see that they advertised their statistics right up front: the project is 78% done, with 127,926 pages from 13 voyages transcribed.[4] These voyages took place mostly in the late 19th and early 20th centuries, and I preferred something earlier, so I moved on to the whaling page.
That page was set up a little differently: at a guess, judging by the style of each, I would say the classic page is designed to appeal to younger users and the whaling page to older ones. From the whaling page the participant can choose a ship and, in most cases, learn a little about it, such as its home port, Master, and where it was built. Once you choose a ship, you can see the data for that ship’s project. I chose the Eliza Adams out of New Bedford, Master Coddington P. Fish (you can’t make this stuff up); overall, an estimated 6% of her log has been completed.[5] The Eliza Adams was a whaler; she would have sailed the Atlantic, around the Horn, and into the Pacific to the whaling grounds, and likely encountered some wild differences in weather along the way.
I was very excited about the chance to read a ship’s log, having grown up as the daughter of an avid sailor who taught me a love of old ships. Sadly, this project is only looking for weather information. The sections to be transcribed were highlighted, and the dialog box in which I entered the transcription covered much of the page, so I could read only snippets of the rest of the log. I did notice some wonderful pen-and-ink drawings of whales in the margins, but I would have loved to be able to transcribe whole pages and read the stories. Nevertheless, once I started it was very hard to stop; it was as if I believed that if I transcribed one more “fine weather” I would be rewarded with a tale of a hurricane or some other great storm. I wasn’t, but I did get to read about pack ice, which ameliorated my disappointment somewhat.

The second project I chose was the Atlas of Living Australia from the Australian Museum. This project offers a few choices involving very different skill sets and interests: transcribing historical documents, capturing data from specimen and object labels, and identifying and tagging animals and objects. The “learn more” link worked, but the page it led to (which contained links to various bits of information) did not. I returned to the homepage and clicked “get involved,” which took me to a live page with about half a dozen projects to join. Statistics on this page note that 1,524 volunteers have completed 311,963 of 322,753 tasks.[6]
I chose the Australian Museum Bivalve 25 Expedition, which consists of transcribing the labels on bivalve specimens (think clams, scallops, mussels, etc.), presumably from the museum’s collection. There was a tutorial, so I thought I would check that out, as this sounded a little more complicated than the weather project. The tutorial was a 14-page PDF. I did skim through it, but, sad to say, I did not then participate in the project. I felt as though I didn’t have the time, and this one needs time. The transcriptions include the latitude and longitude of each item’s origin, with specific instructions on the proper notation for degrees and minutes. There is an entire page of abbreviations of physical locations that might appear on the labels, and what they mean. There are instructions on what to do if the same information appears in two places, as some of these objects have more than one label. The project has four volunteers working on it, and I admire their dedication: they have completed 19% of the 582 tasks.[7] They have a long way to go and could probably use more help. But how to get that help?
This speaks to the challenges of broadcasting the need for volunteers and of understanding those volunteers’ motivations. I stumbled upon both projects for this post while reading articles about crowdsourcing. At first I passed over the bivalve project, but I was having a difficult time finding one that I thought would hold my interest. Internet searches had not returned the results I sought. Using articles that named crowdsourcing organizations, I searched their websites for appropriate projects. All too often the projects I found interesting had failed. Some projects I investigated had only a small mention on the website of the institution that was to benefit from the work, and no mention elsewhere that I could find. (It made me wonder how much internal resistance there was to crowdsourcing that project.) So I returned to the Australian Museum and its bivalves, but ultimately did not work on the project. For crowdsourcing to be successful, project managers must first effectively advertise their need; they must understand what draws someone to a project and what makes them stay.
Once potential volunteers have discovered a project, they must find some allure, some connection, and possibly a host of other factors in order to stay. Some projects have had success using humor and games to bring participants on board.[8] Others have appealed to the passion people have for a subject and have succeeded by staying focused on the help they do have, rather than trying to add more people who might not be as dedicated.[9] Unknowingly, I chose two very different projects: one that involved transcribing very short phrases and one that involved extremely detailed scientific information. Both appealed to me, but I participated in one and not the other. In my case it was a question of time, not interest, that made my decision. Other elements besides personal interest factor into the equation and must be evaluated in order for a project to succeed.
Trevor Owens addresses participant motivation in his article on crowdsourcing and concludes, “People identify and support causes and projects that provide them with a sense of purpose.”[10] Certainly that is part of the draw, but it doesn’t go deep enough. A potential volunteer may be attracted to a project that is meaningful, but that doesn’t guarantee they will actually participate. If we had some demographic information on the four bivalve volunteers, we might find that they are retired, or that they are marine biologists, or that their grandmothers were avid shell collectors who taught them the Latin names of shells on walks along the beach (as mine did, which is what led me to investigate the project), or some other combination of factors that piqued their interest and allowed them the time to participate.
For those institutions that welcome the help of the layperson, it seems there is nearly as much work in setting up a successful project as in simply doing the project, and perhaps therein lies the difficulty. The project must be set up in such a way as to draw people in, to make them feel as though they could be helpful, that they have the expertise. It must reach the appropriate audience: the institution’s own website will reach those who are already part of its community, but what about people in other parts of the country or the world who belong to similar communities? The project itself must be accessible, understandable, and doable. All of this takes funds, time, and people. I wonder whether, for many places, it is an equal strain on resources to do a project in-house or to set it up for crowdsourcing.
My experience with both of these transcription projects tells me that I do best at volunteer work when I have an emotional connection to it. Even with that in my favor, I must still make the decision based on the amount of time, energy, and drive I have to dedicate to the project. I would rather not be part of a project than do it badly. For crowdsourcing to be successful, institutions may have to be prepared to be more specific about what the project entails, the amount of time a participant may expect to spend, and the level of expertise required. Participants will need to decide for themselves whether they have the passion the project requires.


[1] “Citizen Archivist Dashboard: Transcribe Old Weather,” National Archives, http://www.archives.gov/citizen-archivist/old-weather/. Accessed 6 April 2016.
Revenue Cutters were a precursor to the United States Coast Guard, and were instituted in the 18th century to enforce the revenue laws of the new nation, which was struggling financially after the Revolutionary War. In other words, they acted as customs officials and made sure that goods were brought into and out of port legally, with all taxes and tariffs paid. Letter from Alexander Hamilton, dated June 4, 1791, to commanding officers of revenue cutters, https://www.uscg.mil/history/faqs/hamiltonletter.pdf. Accessed 6 April 2016.
[2] “About the National Archives: Our Mission and Vision,” National Archives, http://www.archives.gov/about/info/mission.html. Accessed 8 April 2016.
[3] “Our Mission and Vision,” National Oceanic and Atmospheric Administration, U.S. Department of Commerce.
[4] “Old Weather: Our Weather’s Past, the Climate’s Future,” https://www.oldweather.org/#/.
[5] “Eliza Adams 1863,” Old Weather Whaling, https://whaling.oldweather.org/#/groups/560466483937640006140100.
[6] Homepage, DigiVol, http://volunteer.ala.org.au/#expeditionList. Accessed 8 April 2016.
[7] “Australian Museum Bivalve 25 Expedition,” Australian Museum, http://volunteer.ala.org.au/project/index/12362312. Accessed 10 April 2016.
[8] Harry van Vliet and Erik Hekman, “Enhancing User Involvement with Digital Cultural Heritage: The Usage of Social Tagging and Storytelling,” First Monday, vol. 17:5, May 2012.
[9] Trevor Owens, “Digital Cultural Heritage and the Crowd,” Curator: The Museum Journal, vol. 56:1, January 2013.
[10] Owens, “Digital Cultural Heritage and the Crowd,” January 2013.
