New Functional Schedule for North Carolina State Agencies

A years-long project at the State Archives of North Carolina has culminated in the publication of the first Functional Schedule for North Carolina State Agencies.  Where state agencies previously relied on a General Schedule for State Agency Records and hundreds of program-specific schedules, all state agency officials now have one 16-part retention and disposition schedule to guide them in the management of their public records.

In 2015, the Records Analysis Unit of the Government Records Section at the State Archives of North Carolina (SANC) began a project to revamp the retention and disposition schedules for state agencies in North Carolina.  The overarching goals of the project were to simplify records retention, make the assignment of records dispositions more transparent, and ensure the retention of records with permanent value, either within the creating agency or at the State Archives.  We embraced the technique of functional analysis, whereby the functions of an institution are defined and the records that document those functions are linked to them.  Sixteen functions of North Carolina state government were identified, their record types listed, and disposition instructions provided.

Over 200 stakeholders from across North Carolina state government participated in meetings to review draft schedules for each of the 16 functions, and many SANC staff beyond the Records Analysis Unit also provided constructive feedback.  After these schedules reached their final draft stage, records analysts worked in concert with state agencies to crosswalk their records inventories to the new functional schedule.

These functional schedules standardize disposition instructions across state government and focus on the function of government that necessitates the creation of a record rather than on the particular agency that creates or maintains it.  Users therefore do not need to locate relevant record types based on agency hierarchy; instead, they can identify record types relevant to the particular function of government they perform.  In addition, if the responsibilities of an agency change over time, the appropriate retention and disposition instructions for the records generated by a new responsibility are already covered by one of the 16 functions.  Recognizing that an increasing share of state agency records are created and maintained electronically, we grouped records with similar functions into “big buckets” to facilitate the appropriate disposition of records housed in document management systems.

These new schedules can be viewed at https://archives.ncdcr.gov/documents/functional-schedule-state-agencies.  For a more in-depth look at the process behind this project, check out the case study published by SAA’s Government Records Section.

Request for information: RM bibliography

As Beth Cron posted last December, the Records Management Section steering committee has been working to update the Records and Information Bibliography that was first published in 2008.  We chose to make this new version accessible through Zotero so it can be more dynamic and collaborative.

This is where you come in: we’d like to invite volunteers who can assist us with revisions.  While we’ve tried to update broken links and remove dated or inaccessible resources, we know there are recent studies and articles that have been overlooked.  You are welcome to submit individual items for inclusion, but we’re also seeking volunteers who’d be willing to take on a systematic search through a particular journal.  Here are the publications whose resources we’d like to include in our bibliography:

To contribute to the bibliography, you must first create a Zotero account.  Then you can request access to join the SAA RMRT Group.  Please contact Courtney Bailey if you are willing to review one of the above publications' 2008-present issues to identify relevant records management articles.  Thanks in advance for helping us improve this valuable resource.

Follow up on Open Source Hangout

Beth Cron and I had a good conversation with Seth Shaw and Jeremy Gibson about open source records management tools.  If you didn’t have a chance to join us last week, you can still view the recording at https://www.youtube.com/watch?v=1GRQBUjtOT8.

Here are some of the highlights:

  • Open source tools can be very valuable when they have a strong community around them.  This support leads to active development, which makes for a sustainable tool.  Being able to see other people’s work also provides more entry points to solving your own problem.
  • The downsides to open source tools are that they tend to have less documentation and that projects can be abandoned.
  • Seth mentioned an interview on The Signal about using open source tools in cultural heritage institutions.  You can find it at https://blogs.loc.gov/thesignal/2013/01/when-is-open-source-software-the-right-choice-for-cultural-heritage-organizations-an-interview-with-peter-murray/.
  • Seth has a report pending publication as part of OCLC’s Demystifying Born Digital series (http://www.oclc.org/research/publications/library/born-digital-reports.html).
  • Jeremy pointed out that one of the benefits of open source tools is that you can more easily find tools that do one thing very well — and then stitch those solutions together to accomplish all of your RM needs.
  • In answer to a query about extracting metadata from media files, Jeremy pointed to MediaInfo and JHOVE (a brief sketch follows this list).
  • One particular gap identified in existing open source tools is the lack of a tool to handle redactions.
  • Before adopting a records management tool, it’s important to document your functional requirements and your organizational requirements (e.g., budget, IT support).  Only then can you make sure you’re choosing the right tool for your purposes.
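
To make the MediaInfo suggestion above concrete, here is a minimal sketch of extracting technical metadata from a media file by calling the MediaInfo command-line tool from Python.  This is my own illustration rather than anything shown in the Hangout, and it assumes the mediainfo CLI is installed and is a recent build that supports the --Output=JSON option (older builds may only offer XML or template output).

```python
# Minimal sketch: pull track-level metadata from a media file with the MediaInfo CLI.
# Assumes `mediainfo` is on PATH and supports --Output=JSON (recent versions do).
import json
import subprocess
import sys

def media_metadata(path: str) -> dict:
    """Run mediainfo on one file and return its JSON report as a dict."""
    result = subprocess.run(
        ["mediainfo", "--Output=JSON", path],
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)

if __name__ == "__main__":
    report = media_metadata(sys.argv[1])
    for track in report["media"]["track"]:
        # Each track reports its type (General, Video, Audio, ...) plus format details.
        print(track.get("@type"), track.get("Format"), track.get("Duration"))
```

A similar wrapper around JHOVE would handle format validation, but the exact module options depend on your JHOVE installation, so I have left that out here.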

2 upcoming RM Google Hangouts

The Records Management Section is planning two upcoming Google Hangouts!

The first is on Thursday, February 2 at noon Eastern on Open Source Tools for Records Management. We will be joined by Seth Shaw from Clayton State University, who will discuss the Data Accessioner, and Jeremy Gibson from the State Archives of North Carolina, who will talk about renameIT as well as setting up an AXAEM system. They will discuss the pros and cons of using open source tools — both from the developer and the user perspectives — and provide some advice about implementing them.

The second is on Wednesday, February 8 at noon Eastern and will be on the topic of police body camera footage. Snowden Becker from the UCLA Department of Information Studies will discuss the following questions: Where is the point of intersection between the evidentiary *value* of records, as an archival concept, and the records that actually constitute criminal evidence? What does the legal duty to preserve have to do with the material preservation of records in new formats like digital video, or evidentiary traces created by Internet of Things (IoT)-enabled devices? This discussion will touch on some of the parallels between records management and evidence management in public agencies. We’ll explore how police body-worn camera programs are presenting new challenges to the very definition of public records, and shining light on the widely varying practices of records creation, collection, and retention in the criminal justice system.

Be sure to tune in live to ask questions or watch later at your convenience. You can view the Open Source Hangout here and the police body camera footage Hangout here.

For both Hangouts, we’ll be accepting your questions for our speakers.  If you have a question or topic for discussion, please leave it as a comment here or use the #saarmrt hashtag on Twitter.  We will also monitor the comments on the YouTube live streaming page.

Open Source Tools

To round out this year’s look at open source tools, I want to provide an overview that can serve as a primer for the Hangout we intend to host early in 2017.  Open source software is “software with source code that anyone can inspect, modify, and enhance.”  Opensource.com goes on to provide a more expansive explanation of the purpose of open source software:

“Open source projects, products, or initiatives embrace and celebrate principles of open exchange, collaborative participation, rapid prototyping, transparency, meritocracy, and community-oriented development.”

As I noted in previous posts, NARA published a report in 2015 entitled “Open Source Tools for Records Management.”  This report points to the generally free cost of open source software and the “very robust user and developer communities that are actively working to report bugs and improve the tools” as advantages of its use.  However, the report also acknowledges that (1) care must be taken to guarantee adequate security when deploying open source software and (2) customization may be required, which will probably also demand time and IT know-how.

Open source software differs from closed source, or proprietary, software because its code is open for anyone to see and change.  Although open source software is often provided free of charge, it is not the same as freeware, which may be closed source.  Notable open source technologies include the Linux operating system and the Apache web server.  Linux is a good example of software that is itself open source while support comes with a price, such as that provided by Red Hat.  Opensource.com lists a number of reasons why developers prefer open source software:

  • control over what the software does
  • training by being able to see the source code
  • greater security due to quicker updates to address vulnerabilities
  • stability even after the original creators discontinue their work on the software

In August 2016, Wired published an article entitled “Open Source Won.  So Now What?”  The article points to the first official federal source code policy, which requires federal agencies to release at least 20 percent of new custom-developed code as open source software.  It also acknowledges that open source development can be challenging due to lack of funding and because it’s hard to break into a field increasingly dominated by big companies.

If you’re interested in open source software, stay tuned for more blog posts and our Hangout in 2017.

Open Source Tools: MUSE

Today I offer my third post in a series based on NARA’s 2015 report “Open Source Tools for Records Management.”  I investigated MUSE (Memories USing Email), which was developed at Stanford University. It is available for use in a Windows, Mac, or Linux environment.  I conducted my tests using Windows.

This program is a visualization tool for analyzing email.  It is still in active development and currently incorporates six tools.  The program:

  • tabulates topics
  • charts sentiments
  • tracks the ingress and egress of email correspondents and groups people who “go together” based on their receipt of messages
  • allows you to browse attachments on a Piclens 3D photo wall
  • offers the possibility of personalizing web pages by highlighting terms also found in your email (requires the use of a browser extension)
  • creates crossword puzzles based on your email archive

Once you download the executable file from the above site, the program runs locally on your computer.  MUSE can be deployed on a static archive of email messages (e.g., an mbox file), or it can fetch email from online accounts for which you have the email address, password, and server settings.  It defaults to analyzing Sent mail, based on the principle that those messages more accurately reflect the topics and people with which the account owner is most engaged, but you can also include additional folders.  You can then browse all messages in the embedded viewer, without having to open each message individually, or use any of the tools listed above.
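
To give a sense of what such a static archive looks like, here is a minimal sketch, assuming a locally exported mbox file named sent.mbox (a hypothetical path), that uses Python's standard-library mailbox module to preview the messages and correspondents a tool like MUSE would analyze.  This is not MUSE code, just an illustration of the input format.

```python
# Minimal sketch: preview an exported Sent-mail mbox archive before handing it
# to an analysis tool such as MUSE.  "sent.mbox" is a hypothetical file name.
import mailbox
from collections import Counter

archive = mailbox.mbox("sent.mbox")
recipients = Counter()

for message in archive:
    # Subjects hint at the topics a tool like MUSE would tabulate.
    print(message.get("Subject", "(no subject)"))
    # Tally To/Cc headers to see which correspondents dominate the archive.
    for field in ("To", "Cc"):
        value = message.get(field)
        if value:
            recipients[value] += 1

print(recipients.most_common(10))
```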

The sentiment analyzer uses natural language processing and a built-in lexicon, but the lexicon can be customized by the user to identify desired terms via the Edit Lexicon screen.
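
To illustrate the general idea of lexicon-driven sentiment tagging (a generic sketch, not MUSE's actual implementation), the snippet below counts hits from a small, hypothetical user-edited lexicon in a single message body.  Swapping terms in and out of the lexicon is effectively what an Edit Lexicon style feature lets a user do.

```python
# Generic sketch of lexicon-based sentiment tagging; not MUSE's implementation.
# LEXICON is a hypothetical, user-customized mapping of terms to categories.
LEXICON = {
    "congratulations": "positive",
    "thanks": "positive",
    "deadline": "anxious",
    "overdue": "anxious",
}

def tag_message(text: str) -> dict:
    """Count lexicon hits per sentiment category in one message body."""
    counts = {}
    for word in text.lower().split():
        category = LEXICON.get(word.strip(".,!?"))
        if category:
            counts[category] = counts.get(category, 0) + 1
    return counts

print(tag_message("Thanks again, and congratulations on beating the deadline!"))
# {'positive': 2, 'anxious': 1}
```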

According to their tip sheet for journalists, MUSE “was originally meant for people to browse their own long-term email archives.  We have now started adapting it for journalists, archivists and researchers.”  Because this lightweight tool is easy to use, it could be a straightforward way for repositories to offer researchers an email analysis capability.  The same tip sheet defines the “sweet spot” for the software as archives of about 50,000 messages.

If you’re interested in learning more about MUSE, a Ph.D. dissertation and a number of papers are available here.  There’s also a video that argues for the value of analyzing personal digital archives.  This project dovetails with the work being done at Stanford on ePADD — check out our Hangout for more information on that project.

SAA Session 707: “Hindsights and Fresh Perspectives: Records Management Programs Learn from Each Other”

This records management session featured participation by several RMRT steering committee members, with Alex Toner (University of Pittsburgh) moderating and Hillary Gatlin presenting.

Anita Vannucci of Emory University emphasized the importance of knowing open/public records laws.  She suggested prioritizing work with the people who want to work with you and then leveraging that work to advocate for additional resources.  She has found it useful to look to her state archives for resources that can be borrowed or adapted and to find out what peer institutions are doing.

Donna McCrea from the University of Montana looked to the American Association of Collegiate Registrars and Admissions Officers’ Retention, Disposal, and Archive of Student Records (2013) for guidance.  They created a records retention schedule (RRS) at the directive of the Commissioner of Higher Education.

Hillary Gatlin from Michigan State University focused on records destruction.  At MSU, the Director of Archives must approve records destructions, so they’ve developed a form that can be seen here.

Daniel Noonan from the Ohio State University reported on their general schedule and department-specific schedules.  The Inter-University Council of Ohio developed a new schedule in 1992 after the universities were “liberated” from the state records management system.

Johna Von Behrens from Stephen F. Austin State University said an internal audit is a good means of identifying the risks of poor records management:

  • non-compliance
  • records not appropriately classified and identified
  • ineffective recordkeeping processes
  • records (paper and electronic) not adequately safeguarded
  • inadequate records retention management
  • processes not communicated

Mary McRobinson reported that Willamette University began a records management program in 2010, and because their archives staff had no bandwidth for this additional work, they brought in outside consultants to devise retention and disposition schedules.  Their process was as follows:

  • set up a steering committee with stakeholders
  • sent out an RFP
  • consultants toured campus, interviewed departments, and developed retention and disposition schedules
  • consultants also produced a guidance report covering the current situation, implementation, and more
  • the RM program is now introduced at new employee orientation
  • individual training of departmental liaisons is coordinated by the RM program

Virginia Hunt from the Harvard University Archives said their RM program was established in 1995 by a vote of the Harvard Corporation.  They ultimately combined collection development and RM services.  They’ve found web archiving to be an effective form of outreach.