Looking to improve searches across large (7-10 GB) files
1 vote
4 answers
78 views
I need to search for many (thousands of) DNs within a large (7-10 GB) LDIF file and print out the entry under each one. I am mostly familiar with shell/Python, so I tried to use awk for this; it works, but it is a bit slow.
I use a counter: when a matching line is found, everything under it is printed, and the counter resets to 0 at the next line starting with dn:.
I can't use grep -A because each entry can contain a different number of lines to be printed.
I don't have the means to import this into another LDAP server/database with indexing; I'm just looking for something quick that I can use for now.
If anyone knows any Linux-fu that can help, please let me know!
Sorry, I meant the dn: pattern in an LDIF file. For clarity, here is what I want.
I have a file with:
dn: pattern1
phone: xxxx
email: yyyy
dn: pattern2
phone: xxxx
email: yyy
I have the dn: pattern that needs to be searched, and the lines below it should be printed until the next line with dn: is reached. There is one empty line between entries.
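Because the entries are separated by exactly one blank line, awk's paragraph mode is a natural fit here. A minimal sketch for a single lookup, assuming the input is named file.ldif and $dn holds the full "dn: pattern1" line (both names are placeholders, not from the question):

awk -v dn="$dn" '
BEGIN { RS = ""; ORS = "\n\n" }   # paragraph mode: one blank-line-separated entry per record
index($0 "\n", dn "\n") == 1 {    # entry whose first line is exactly the dn
    print                         # print the whole entry
    exit                          # DNs are unique, so stop after the first hit
}
' file.ldif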
This is my code so far, with dn set to the full "dn: pattern1" line (file.ldif stands in for the actual input file):
awk -v dn="$dn" '
BEGIN { counter = 0 }
{
    if ($0 ~ /^dn: /) {
        if ($0 == dn) {
            counter = 1    # matching entry found: start printing
        } else if (counter) {
            exit           # next entry reached: stop early
        }
    }
    if (counter) {
        print
    }
}
' file.ldif
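This script scans the whole 7-10 GB file once per DN, so with thousands of DNs most of the time goes into rereading the file. A single-pass sketch that first loads all wanted DNs into an awk array, assuming they are stored one per line as full "dn: ..." lines in a hypothetical dns.txt:

awk '
NR == FNR { want[$0] = 1; next }     # first file: remember every wanted dn line
/^dn: /   { counter = ($0 in want) } # each dn: line turns printing on or off
counter   { print }                  # emit the lines (and trailing blank line) of wanted entries
' dns.txt file.ldif

This reads the big file exactly once and only keeps the DN list in memory, not the LDIF itself.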
Asked by Yas V
(11 rep)
Aug 2, 2024, 09:36 PM
Last activity: Sep 16, 2024, 03:00 PM