Could not find an embedded ZIP in <IMAP message with UID #45>. Skipped.
Hi there,
I've been trying the IMAP feature, but the script is having trouble with the emails I feed it.
In debug mode, I get errors such as the following:
The Current Message UID is: 45
--------------------------------
Subject:
MimeType: text/plain
Could not find an embedded ZIP in <IMAP message with UID #45>. Skipped.
Moving (copy and delete) processed IMAP message file to IMAP folder: Mail server/DMARC/processed
Please find a slightly redacted version of an example email that fails. It seems to fail quite early on, at the MIME::Parser stage, as the subject isn't extracted.
Report Domain: unil.ch Submitter: ComUE Report-ID: unil.ch-1459977903@ComUE.eml.zip
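For reference, the MIME::Parser stage works roughly like the sketch below. This is not the actual code from dmarcts-report-parser.pl, just a minimal illustration; $raw_message is a placeholder for the fetched message text.

use strict;
use warnings;
use MIME::Parser;

# Placeholder input: the raw RFC822 message, e.g. read from a file or an IMAP fetch.
my $raw_message = do { local $/; <STDIN> };

my $parser = MIME::Parser->new;
$parser->output_to_core(1);                        # keep decoded parts in memory
my $entity  = $parser->parse_data($raw_message);   # this is the stage that appears to fail
my $subject = $entity->head->get('Subject') // ''; # empty when parsing went wrong
chomp $subject;

# Find the first part whose suggested file name ends in .zip.
my ($zip_part) =
    grep { ($_->head->recommended_filename // '') =~ /\.zip$/i }
    $entity->parts;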
I took a look at the ZIP file you attached and also saw errors when running against the .eml inside it. However, I did find it worked when I changed the spaces in the file name to underscores (see the renaming sketch after the log output below):
./dmarcts-report-parser.pl -d test/Report_Domain\:_unil.ch_Submitter\:_ComUE_Report-ID\:_unil.ch-1459977903\@ComUE.eml
There are 1 messages to be processed.
--------------------------------
The Current Message is: test/Report_Domain:_unil.ch_Submitter:_ComUE_Report-ID:_unil.ch-1459977903@ComUE.eml
--------------------------------
Subject: Report Domain: unil.ch Submitter: ComUE Report-ID: unil.ch-1459977903@ComUE
MimeType: multipart/mixed
This is a multipart attachment
Skipped an unknown attachment
/tmp/msg-10598-2.zip
body is in /tmp/msg-10598-2.zip
serial 7411 single record
ip=193.49.115.59
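For anyone who wants to apply that same workaround to saved .eml files, here is a small hedged helper sketch (not part of the parser) that replaces spaces with underscores before feeding the files to the script; the test/ directory matches the path used above.

use strict;
use warnings;
use File::Copy qw(move);

# Rename saved report emails so their file names contain no spaces.
for my $file (glob 'test/*.eml') {
    (my $clean = $file) =~ s/ /_/g;                 # spaces -> underscores
    next if $clean eq $file;                        # nothing to do
    move($file, $clean) or warn "Could not rename $file: $!\n";
}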
I'm not sure whether this fully corresponds to what you're seeing over IMAP, but it does seem to be part of the problem. The content of the messages is apparently parseable; how they are getting into the script in the first place seems to be the issue.
Which IMAP server are you running?
Hi!
Thanks for taking a look. I'm connecting to an Exchange 2013 CU12 server.
I'm experiencing the same problem using a direct IMAP connection. I'm using Outlook 365 online.
Sadly, I don't have access to an Exchange server, so debugging will be difficult. I think the first step is making sure we can get similar results in #16 and hopefully make some progress from that end.
If someone is able to make progress on the Exchange angle, I'd be happy to take some patches or code updates!
I initially had this issue when attempting to process a mailbox with miscellaneous mail in it. Not only did the parser fail, it also modified the timestamps on other messages. Ick.
Using a filter (cPanel) to move DMARC reports to their own folder and pointing the parser there solved this issue for me.
(Using IMAP)
I point the parser at a folder containing only DMARC reports and nothing else, so unfortunately this issue is not caused by the parser choking on random, non-DMARC mail.
@laerm did you ever figure this one out in your environment? I am running into the same thing.
@laerm I figured it out. If you are using Exchange, you may need to add a parameter to the Mail::IMAPClient->new() call on line 166:
Password => $imappass,
IgnoreSizeErrors => 1)
# module uses eval, so we use $@ instead of $!
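For context, here is a hedged sketch of what the full constructor call might look like with that parameter added; apart from $imappass, the variable names below are my assumptions, not necessarily the ones the script actually uses.

use strict;
use warnings;
use Mail::IMAPClient;

# Assumed stand-ins for the script's IMAP configuration values.
my ($imapserver, $imapuser, $imappass) = ('imap.example.org', 'dmarc-reports@example.org', 'secret');

my $imap = Mail::IMAPClient->new(
    Server           => $imapserver,
    User             => $imapuser,
    Password         => $imappass,
    Ssl              => 1,                # adjust to your connection settings
    IgnoreSizeErrors => 1,                # ignore the wrong message sizes Exchange reports
) or die "IMAP connection failed: $@";    # module uses eval, so we check $@ instead of $!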
I added IgnoreSizeErrors because the documentation for this module told me to do so:
IgnoreSizeErrors
Certain (caching) servers, like Exchange 2007, often report the wrong message size. Instead of chopping the message to fit the reported size, the reported size will simply be ignored when this parameter is set to 1.
http://search.cpan.org/~plobbes/Mail-IMAPClient-3.38/lib/Mail/IMAPClient.pod#Ignoresizeerrors
This might actually be a good pull request but I do not know what the downside of this parameter is if you are using other mail servers.
@key134 I made that change a few days ago and haven't seen any ill effects on my setup yet. I only get 2-3 reports per day, though.
cPanel/WHM server (Exim/Dovecot).
@key134 That did it! I was able to parse my reports without issue! Well, nearly all of them; I got a few dud reports that didn't parse, which I'll investigate.
Thanks! :)
As far as I'm concerned, we can close the issue. I'll let @techsneeze decide whether or not he wants to add this fix, and then close it.
@techsneeze close?