I encountered an "Out Of Memory!" message when opening a CAP file. The file size is 1,180,929 KB. The workstation I am using has an Intel Core i5 processor and 4 GB of memory, running Windows 7 32-bit. I would like to find out whether upgrading to a workstation with an Intel Core i7 processor and 8 GB of memory, running Windows 7 64-bit, would overcome this issue. Thanks!

asked 26 Jan '12, 00:57 Danny Chan
2 Answers:
Have you checked the OutOfMemory wiki page already? You may want to look into using the latest Wireshark version and/or splitting your capture file with editcap.

answered 26 Jan '12, 01:22 Jaap ♦
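For reference, here is what splitting a large capture with editcap might look like; this is only a sketch, and the file names and the 100,000-packet chunk size are placeholders, not values from this thread:

    editcap -c 100000 big_capture.cap split_capture.cap

Each output file gets a numbered suffix, and -i <seconds per file> can be used instead of -c to split by time interval rather than by packet count.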
In theory, a 64-bit version of Windows and a 64-bit version of Wireshark (which requires an OS that supports 64-bit programs, meaning, for Windows, a 64-bit version) should help, as long as you have a big enough paging file. See this article from Microsoft on setting the paging file's maximum size (which would presumably default to 24 GB on the upgraded machine). However, bug 5979 in the Wireshark bug database says that 64-bit Wireshark doesn't appear to actually be 64-bit in a useful sense; we haven't yet figured out what the heck is going on there.

2 GB is less than the default minimum page file size on any machine where you'd be likely to run 64-bit Windows (the default minimum is 300 MB more than the main memory size), so I doubt it's an issue with the paging file. (I think that, given what the Microsoft article says, you'd get warning dialogs from the system if you run out of paging file space.)

answered 26 Jan '12, 23:08 Guy Harris ♦♦

I am running Win7 x64 SP1 with 8 GB of memory, and I run into Wireshark out-of-memory conditions even with captures as small as 100 MB (even using tshark!). I suppose that means I should donate to Wireshark for development efforts? ;) (17 May '13, 12:57) mlan

No. You should use dumpcap instead of TShark/Wireshark. File size doesn't matter that much, because the out of memory is not caused by packet bytes but by the analysis structures Wireshark/TShark create in memory while capturing. dumpcap doesn't do this; it just writes packets to file. (17 May '13, 13:11) Jasper ♦♦

The "use dumpcap" rule isn't a general solution to all out-of-memory problems. It's a specific solution to the "I'm running {Wireshark, TShark} for a very long period of time, capturing traffic, and it runs out of memory" problem. The full solution is to run dumpcap in a mode where, instead of saving all packet data to a single file (which just defers the problem: when you try to read that big file with Wireshark or TShark, it'd still run out of memory), it saves to a sequence of files, keeping each file limited in size (see the sketch after this thread). If you're running out of memory with a 100 MB file, that might just be a bug. At least on a 64-bit Mac, I've been able to read a gzipped file that's about 265 MB (so it's even bigger uncompressed); it took a while on the original machine I tried it on, which had only 4 GB of main memory, and it created a fair bit of extra swap space in the process, but it did work. (17 May '13, 13:30) Guy Harris ♦♦

I use dumpcap for my captures, but was trying to use tshark to filter a 100 MB pcap created with dumpcap. That is when I experienced the OOM condition. Is it possible to use dumpcap to filter an existing pcap file? (17 May '13, 16:29) mlan
No, but if you can use a capture filter rather than a full-blown display filter, you can use tcpdump:
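For example (the file names and the filter expression here are only placeholders):

    tcpdump -r input.pcap -w filtered.pcap 'host 192.0.2.1 and port 80'

This reads input.pcap, applies the capture filter, and writes only the matching packets to filtered.pcap, without building the per-packet analysis structures that make Wireshark/TShark run out of memory.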
(17 May '13, 16:50) Guy Harris ♦♦
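As an aside, here is a minimal sketch of the multiple-file dumpcap mode described in the comments above; the interface and the sizes are assumptions, not values from this thread:

    dumpcap -i <interface> -b filesize:100000 -w capture.pcapng

The -b filesize option is given in kilobytes, so this starts a new capture file roughly every 100 MB; adding, for example, -b files:20 would turn it into a ring buffer that keeps only the 20 most recent files.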
I totally agree with Jaap. With files that big I would try Pilot.