This is our old Q&A Site. Please post any new questions and answers at ask.wireshark.org.

Hello,

I'm trying to dump a very large TCP stream (600 MB) in raw format. I captured the stream and saved it to disk. When I reopen the file, I want to use the "Follow TCP Stream" window to save the stream (only the downstream direction) in raw format to my disk. So I click on "Follow TCP Stream"; the filtering works fine, but as soon as the progress bar reaches 99%, memory consumption rises excessively until Wireshark crashes. Since it only crashes after the filtering is almost complete, I imagine this is a problem with the GUI (loading 600 MB into the GUI might be very memory-consuming).

So my question is: is there a way to circumvent this crash? Is it maybe possible to dump the unidirectional raw stream directly into a file without using the "Follow TCP Stream" window in the GUI? I checked the command line options but I couldn't find an appropriate function there.

I would appreciate any help with this.

This question is marked "community wiki".

asked 20 Mar '11, 10:01 by Welch13


You can circumvent memory-usage-related crashes by cutting your very large trace file into smaller files using the command line tool "editcap", which is installed together with Wireshark. I usually use the "-c" parameter to cut my files into pieces of 100,000 frames each; Wireshark should have no problem loading these.
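As a minimal sketch (the filenames are placeholders, not from the thread), the editcap call looks like this:

```shell
# Cut the large capture into pieces of 100,000 frames each.
# editcap numbers the output files automatically.
editcap -c 100000 large_capture.pcap slice.pcap
```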

The problem is that as soon as you cut your large file into pieces, you probably won't like the result, because the stream you're trying to extract spans multiple files. For that kind of problem I usually create a batch file that calls another command line tool, "tshark", on each of them with the parameters

-r <infile> -R "filter to extract the TCP flow by socket parameters" -w <outfile>

Afterwards, I use the third command line tool, "mergecap", with the "-a" option to concatenate the extracted pieces of the flow, which together are usually a lot smaller than the large file I started with.
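Put together, the per-slice extraction and the final merge might look like this (the filter values and filenames are examples, not taken from the thread):

```shell
# Extract the flow of interest from every slice (example socket
# parameters), then merge the per-slice results in file order (-a).
for f in slice_*.pcap; do
  tshark -r "$f" -R "ip.addr==10.0.0.5 && tcp.port==80" -w "flow_$f"
done
mergecap -a -w flow_full.pcap flow_slice_*.pcap
```

Note that "-R" is the read-filter option of the tshark 1.x releases current when this thread was written; newer tshark versions use "-Y" for display filters.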

answered 20 Mar '11, 12:52 by Jasper ♦♦

Thanks for your answer, but does mergecap work on raw (binary) files? If so, how would I invoke that option?

answered 20 Mar '11, 13:09 by Welch13

I guess by "raw" you mean it's binary, usually in pcap format. mergecap works on the same binary trace file formats as Wireshark. You can concatenate files like this:

mergecap -a -w <outfile> <infile1> <infile2> ... <infile#>

answered 20 Mar '11, 15:03 by Jasper ♦♦

Is all of that 600 MB of data really needed, i.e., is all of it part of the stream of interest? If not, you might try using as stringent a capture filter as possible to limit the number of packets you capture in the first place. If you already have a large capture file with lots of extraneous packets, you might try using tshark instead of Wireshark to do your filtering.

For more information on "Out of Memory" issues, see http://wiki.wireshark.org/KnownBugs/OutOfMemory.

answered 21 Mar '11, 07:43 by cmaynard ♦♦

Yes, almost all of the 600 MB is actually needed. To clarify: the capture file of interest is already filtered. It contains all the up- and downstream traffic of one single connection, from one single IP in the network to one single IP on one port, over two hours; I imagine the user watched some kind of Flash video of a church service. I want to reconstruct the downstream part of it. So filtering really isn't the issue; it's rather extracting the downstream part (the upstream might be only 5% of the file size) and writing it into a binary file that can be used to reconstruct the full stream the application on the user's computer saw at the time.
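A hedged sketch of cutting the capture down to just the downstream direction with tshark (the server IP and port are placeholders, since the thread doesn't name them):

```shell
# Keep only packets sent by the server, i.e. the downstream direction.
# 203.0.113.10 and port 80 are placeholder values.
tshark -r filtered_capture.pcap -R "ip.src==203.0.113.10 && tcp.srcport==80" -w downstream.pcap
```

This still produces a capture file rather than the raw payload bytes, so reassembling the payload itself would still need "Follow TCP Stream" (or a similar reassembly step) on the now much smaller file.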

I'm still trying to get Jasper's proposed solution working. In the meantime, further comments and solutions are of course welcome.

answered 21 Mar '11, 13:23 by Welch13

Well, if your 600 MB file is already filtered down to the one stream you want to extract, my method won't help much - I was under the assumption that it contained irrelevant data as well.

Maybe the cutting process with editcap can still help: extract the payload of the smaller trace files using the "Follow TCP stream" option, and then it might be possible to concatenate the extracted content with "copy <contentfile1> /b + <contentfile2> /b + ... <contentfile#> <destinationfile> /b" (a binary copy of the files).
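On a Unix-like system, the equivalent binary concatenation is just cat; a tiny self-contained demonstration, with two-byte placeholder files standing in for the extracted content:

```shell
# Create two small stand-in content files, then concatenate them
# byte for byte - equivalent to the Windows "copy /b" above.
printf 'AB' > contentfile1.bin
printf 'CD' > contentfile2.bin
cat contentfile1.bin contentfile2.bin > destinationfile.bin
cat destinationfile.bin   # prints ABCD
```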

(21 Mar '11, 16:03) Jasper ♦♦
question asked: 20 Mar '11, 10:01

question was seen: 9,071 times

last updated: 21 Mar '11, 16:03

powered by OSQA