Date: Fri, 5 Feb 2021 18:08:39 +0000
Content-Type: text/plain
To a human this is redundant data, but to a computer it is not duplicate data, because the concatenated field is not identical to the content of any other individual field. The remove duplicates tool will not help in this case.
Assuming that the fields you want to keep have exactly one occurrence of the " -- " (that's <Space><Dash><Dash><Space>) separator,
AND
the ones you want to delete always have at least two of them,
you can use the Delete Field tool:
Set the Field to 653
Find: \s--\s.+\s--\s
and check the box to use regular expressions.
This will find and delete any concatenated data fields that have at least two <Space><Dash><Dash><Space> separators.
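If you want to sanity-check the pattern before running it against 15,000+ records, here is a quick sketch you can run outside MarcEdit. The field strings are just the examples from your record below, retyped; Python's regex flavor is not necessarily what MarcEdit uses internally, but a simple pattern like this behaves the same way:

import re

# At least two " -- " separators: the greedy .+ forces a second
# <Space><Dash><Dash><Space> to appear later in the same field.
pattern = re.compile(r"\s--\s.+\s--\s")

fields = [
    "FRANCHISES (GRANTEE) -- COMMERCIAL PACIFIC CABLE COMPANY FRANCHISES (TYPE) -- UNDERGROUND CABLE SORSOGON (PROVINCE)",
    "FRANCHISES (GRANTEE) -- COMMERCIAL PACIFIC CABLE COMPANY",
    "FRANCHISES (TYPE) -- UNDERGROUND CABLE SORSOGON (PROVINCE)",
]

for field in fields:
    # Only the concatenated field (two separators) matches and would be deleted.
    print("DELETE" if pattern.search(field) else "KEEP  ", field)

Only the first, concatenated field should print DELETE; the two single-separator fields should print KEEP, which mirrors what Delete Field will do with the same Find value.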
Sorry, I’m not somewhere I can get you a screenshot.
DougR
Sent from my mobile, which is solely responsible for all nonsensical auto-corrections.
> On Feb 5, 2021, at 9:22 AM, Eugene Espinoza <[log in to unmask]> wrote:
>
> I have the following 653s in my record:
>
> =245 \\$aACT NO. 2888
> =653 \\$aFRANCHISES (GRANTEE) -- COMMERCIALPACIFIC CABLE COMPANY FRANCHISES (TYPE) -- UNDERGROUND CABLE SORSOGON (PROVINCE)
> =653 \\$aFRANCHISES (GRANTEE) -- COMMERCIAL PACIFIC CABLE COMPANY
> =653 \\$aFRANCHISES (TYPE) -- UNDERGROUND CABLE SORSOGON (PROVINCE)
>
> These were converted from CSV via the delimited text translator. As you can see, the first 653 duplicates the succeeding 653s in a single line. Other 653s in this record set are not like this, so I have a mix of normal 653s and the case described above. The thing is, I have already edited some records manually and merged records, so redoing everything is out of the question (I have 15,000+ records). I still have the CSV for this record, which explicitly shows the case above. My question is: is there a way to delete the weird tags (the ones that duplicate the succeeding 653 tags)? If I merge the records, I would then have duplicates for the weird 653s. I see the remove duplicate data option, but does it actually remove this kind of duplicate data? Or would the workflow be to remove the whole field for the duplicated data?
>
> Thank you very much!
>