Author Topic: Introduction  (Read 206 times)

RichB

  • Global Moderator
  • Newbie
  • *****
  • Posts: 37
    • View Profile
Introduction
« on: August 17, 2020, 04:06:24 pm »
4) Topic: Quality control, Correction of records, Normalisation rules

Once you’ve decided what a ‘quality’ record looks like, how do you ensure that it becomes one? Editing, or correcting, records can be a time-consuming and expensive affair (in terms of staff time, and especially if you have already paid for records that don’t aid discoverability). How do we demonstrate the value of what we are contributing to our colleagues, particularly those who are budget holders?

Questions to consider:
•   Are there any processes or normalisations that you use routinely on shelf-ready records? Is this via MarcEdit or your LMS? Or a combination of the two?
•   Is this prior to the EDI process or at the point of receipt?
•   Do you have to edit records manually? And do you have the resources to do this?
•   Are you just correcting errors or are you improving the record by adding more detail? Or are you doing both?



RichB

Re: Introduction
« Reply #1 on: August 19, 2020, 01:00:58 pm »
Again there may be some repetition but in response to the questions:

1) No; with our current workflows and suppliers it hasn't made sense to do this, but if we increase our number of suppliers then I think I'll need to look at it again in order to maximise time/effort.
2) N/A but I think I would look to do this at the point of receipt - as you know what you've got!
3) We've already touched on this - yes, we (well, I) do manually intervene, but only with items that are 'exceptions' (just wrong, poor quality, partial information, poorly displaying and so on). I think the number of exceptions may be influenced by the type of material we have - the arts in general, and music in particular, benefit from more intervention.
4) Both - ironically, recently it's been mainly removing poorly written/over-filled fields that don't display properly in our LMS - and this is where normalisation rules would help.
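As a rough illustration of the kind of normalisation rule meant here, the sketch below drops local fields and truncates over-filled summary notes, using a plain dict as a stand-in for a MARC record. The tags, the 500-character limit, and the `normalise` helper are all illustrative assumptions; in practice this would be a MarcEdit task or an LMS import rule.

```python
MAX_LEN = 500  # hypothetical display limit in the LMS

def normalise(record):
    """Drop local 9XX fields and truncate over-filled 520 summaries.

    `record` is a simplified stand-in for MARC: tag -> list of values.
    """
    cleaned = {}
    for tag, values in record.items():
        if tag.startswith("9"):          # local fields we don't want to display
            continue
        if tag == "520":                 # summary notes that overflow the OPAC
            values = [v[:MAX_LEN] for v in values]
        cleaned[tag] = values
    return cleaned

record = {
    "245": ["A title /"],
    "520": ["A very long summary " * 100],   # far over the display limit
    "950": ["vendor-local data"],
}
fixed = normalise(record)
```

The same idea, per field, is what a MarcEdit task list or Alma normalisation rule would apply at import time.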

A question for everyone - how can we demonstrate to our SMTs that our interventions do improve discoverability? Is there any research?

RichB

Re: Introduction
« Reply #2 on: August 19, 2020, 01:20:59 pm »
Do others not perceive this as a problem? In our previous discussions people had raised the question of MarcEdit training - does anyone use MarcEdit in relation to normalisation rules? 

dboyes1

  • Newbie
  • *
  • Posts: 5
    • View Profile
Re: Introduction
« Reply #3 on: August 19, 2020, 01:25:56 pm »
We use a combination of MarcEdit and our LMS on our shelf-ready records, at the point of receipt. We do some manual editing as required according to our house standards, plus any extra notes for Special Collections etc. We have two members of staff cataloguing/editing and upgrading records. We add more detail and notes as required to aid discoverability. As regards influencing and gaining interest from SMTs, I have seen some enquiries/emails regarding the use of ORCIDs in MARC records. If this aspect extends discoverability for authors and their works, then is this of interest for the REF process and networking?
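On the ORCID point, here is a hedged sketch of what embedding an ORCID in a name field might look like, with (code, value) pairs standing in for real MARC subfields; the choice of subfield $1 for the URI and the `add_orcid` helper are assumptions for illustration only.

```python
def add_orcid(name_field, orcid):
    """Append an ORCID URI to a personal-name field (list of
    (subfield code, value) pairs) if it is not already present."""
    uri = f"https://orcid.org/{orcid}"
    if ("1", uri) not in name_field:
        name_field.append(("1", uri))
    return name_field

# ORCID's own documentation example iD, used here as sample data.
author = [("a", "Carberry, Josiah,"), ("e", "author.")]
add_orcid(author, "0000-0002-1825-0097")
add_orcid(author, "0000-0002-1825-0097")   # idempotent: no duplicate added
```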

Corinne Lambert

  • Newbie
  • *
  • Posts: 18
    • View Profile
Re: Introduction
« Reply #4 on: August 19, 2020, 01:28:49 pm »
"A question for everyone - how can we demonstrate to our SMTs that our interventions do improve discoverability? Is there any research?"

It can be useful to keep copies of correspondence with readers regarding issues with discoverability and use them as evidence of the importance of good metadata. I don't know if there is any research in this area - a good dissertation subject for someone, perhaps?



Jason Starksfield

  • Newbie
  • *
  • Posts: 7
    • View Profile
Re: Introduction
« Reply #5 on: August 19, 2020, 01:29:38 pm »
We're quite lucky that the head of our dept is the deputy director for the library and is sympathetic to our anecdotal evidence regarding the need for our intervention, particularly since our discovery systems seem to be so patchy.

I did a quick search to see if there's any research specific to this question, but if there is, my library hasn't got access (or our discovery system can't find it...). A few articles about how a lack of consistency can be a hindrance, though.

We, well I, have just started looking at linked data, which will be a whole new skill set to develop. It's probably a bit much to ask that any imminent standards include this capability, seeing as the library ecosystem in general doesn't seem to have onboarded it yet, but the potential power here is something we should at least aspire to.

Since working from home, I have gradually been using MarcEdit to upgrade some old records to the RDA standard, and consulting the NBK for extra info such as contents and extra subject headings, etc. Because this is a laborious one-record-at-a-time process, it may take a little while to see what effect this may have on discoverability and subsequent uptake.

RichB

Re: Introduction
« Reply #6 on: August 19, 2020, 01:30:28 pm »
Thanks, that sounds similar to my own experiences. We've also been looking recently at including ORCIDs - it's something SUMMON 'switched on' automatically recently. I do think that it would be beneficial to the research community, but am less convinced for the 'general' user searching for resources.

RichB

Re: Introduction
« Reply #7 on: August 19, 2020, 01:36:00 pm »
I agree with Corinne & Jason that 'user stories' are extremely helpful (and you're lucky you can get them - I can't), but quantifiable data is often what is required (i.e. data-led decisions). Will we ever be in the situation where we can say x amount of changes to a record = x amount of increase in usage?

Nick Williams

  • Newbie
  • *
  • Posts: 30
    • View Profile
Re: Introduction
« Reply #8 on: August 19, 2020, 01:37:24 pm »
Our quality control for e-book package vendor records that we load into Alma is to use MarcEdit to ensure the total number of records corresponds to the number of books Acquisitions says it has bought, and that each record contains at least the following: LDR, 006, 007, 008, 020, 035, 040, 100, 245, 250, 264, 300, 6XX, 856. The trickiest bit is making sure that subfield $u of the 856 is always the first subfield in that field, because that's what the import profile uses for creating the portfolio record and hence access.

When doing quality control on generic print book shelf-ready records in a previous role, I just checked that the required fields for every book were present, rather than checking in detail that the contents of those fields were actually correct. There wasn't time to do more. I appreciate that some might claim that this isn't actually quality control.
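A minimal sketch of those two checks (required-field presence, and forcing $u to the front of the 856), over a simplified tag-to-fields dict rather than real MARC; the `REQUIRED` list is abbreviated from the full list above and the helper names are invented for illustration.

```python
# Simplified stand-in for a MARC record: tag -> list of fields,
# each field a list of (subfield code, value) pairs.
REQUIRED = ["020", "245", "856"]          # abbreviated from the full list above

def missing_fields(record):
    """Return any required tags the record lacks."""
    return [tag for tag in REQUIRED if tag not in record]

def u_first(field_856):
    """Reorder an 856 so subfield $u comes first, since the import
    profile keys on the first subfield to create the portfolio."""
    return ([sf for sf in field_856 if sf[0] == "u"]
            + [sf for sf in field_856 if sf[0] != "u"])

record = {
    "245": [[("a", "Example title /")]],
    "856": [[("z", "Connect to e-book"), ("u", "https://example.org/ebook")]],
}
gaps = missing_fields(record)             # the 020 is missing here
fixed_856 = u_first(record["856"][0])     # $u moved to the front
```

In a real workflow the equivalent would be a MarcEdit validation report plus an edit-subfield task run before loading into Alma.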
« Last Edit: August 19, 2020, 03:18:18 pm by Nick Williams »

Corinne Lambert

Re: Introduction
« Reply #9 on: August 19, 2020, 01:38:25 pm »
At the University of Leicester most of our new stock, both electronic and print, is shelf-ready with records supplied by the vendors. We use Alma as our LMS and apply our own written normalisation rules to the new MARC import files as they are ordered.

Individual items in newly ordered MARC files are reviewed daily by the Metadata Specialists (1.6 FTE), who can edit any that give cause for further concern, so that e-resources are accurate as soon as possible after activation and print can be received and go straight to the shelves. We scan down the list of titles and edit any that are not OK. There are a few exceptions, of course - multi-volume sets seem to be tricky for suppliers to get right. For the big packages we tend to fix titles when readers report issues with them; this is the same with legacy data issues.

We have had issues with RDA/AACR2 hybrid records, which contain a mixture of RDA and AACR2 fields.

We have recently devised a search for 'brief' records in our catalogue - not AACR2 'brief', but what Alma considers brief. Interestingly, the MARC coding for record level is not always accurate to the actual state of the record, a reflection of how important those LDR and 008 codes are for this sort of collection housekeeping work.
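For what it's worth, a back-of-envelope version of that 'brief' check might combine a field count with the Leader encoding level (LDR position 17), since, as noted, the coded level alone is unreliable. The threshold and the set of 'full' codes below are made-up assumptions, and Alma's own definition of brief differs.

```python
# Leader position 17 is the encoding level; blank and "1" are taken
# here (an assumption) as "full enough".
FULL_LEVELS = {" ", "1"}

def looks_brief(leader, record, min_fields=8):
    """Flag a record as brief if it has few fields OR a non-full
    encoding level; the field count cross-checks an unreliable LDR."""
    return len(record) < min_fields or leader[17] not in FULL_LEVELS

full_leader = "01234nam a2200289 a 4500"   # LDR/17 is blank (full level)
thin_record = {"245": [], "100": []}       # only two fields: suspicious
```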

Corinne

Jason Starksfield

Re: Introduction
« Reply #10 on: August 19, 2020, 01:49:33 pm »
'Will we ever be in the situation when we can say x amount of changes to a record = x amount increase in usage? '

This would need quite a long-term study, I think. I could perhaps start something with the upgrades I've done, but I suspect I'll need to be able to at least get information on record views out of the catalogue. I'll have a word with my systems colleague to see if this is possible.

If it is possible, then I could take a look at the total views for the previous x months and compare it with the subsequent period. We'd need to control for whether extra paths to the record are added, though - through resource lists, for example.
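As a toy version of that comparison, assuming monthly view counts per record can be exported from the catalogue (the figures and the `uplift` helper below are invented for illustration):

```python
from statistics import mean

def uplift(views, upgrade_month):
    """Mean monthly views after the record upgrade minus the mean
    before it; `views` is a list of monthly counts, oldest first."""
    before = views[:upgrade_month]
    after = views[upgrade_month:]
    return mean(after) - mean(before)

monthly_views = [4, 5, 3, 4, 9, 11, 10]   # hypothetical data
change = uplift(monthly_views, upgrade_month=4)
```

Even this naive difference would need the confounders mentioned above controlled for before reading anything into it.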

RichB

Re: Introduction
« Reply #11 on: August 19, 2020, 01:55:10 pm »
Jason, this would be a useful piece of work. As fellow Capita users (you are still with them aren't you?) I'd be interested in talking to you about this but, as you say, there's no instant answer and 'system analytics' don't (and can't?) do this.

Jason Starksfield

Re: Introduction
« Reply #12 on: August 19, 2020, 02:04:53 pm »
We are currently still using Capita, though there is talk of moving, which may hinder things - no specific timeframe yet. I'll see what analytics I can get for historical data.