This is our old Q&A Site. Please post any new questions and answers at ask.wireshark.org.

Is it possible to build a 10 Gbps+ stream-to-disk capture appliance using Wireshark software?

I already have the necessary hardware, but I haven't found any free packet analysis software capable of capturing such a large amount of traffic. Does Wireshark include a stream-to-disk feature for this amount of traffic?

Any help or advice would be appreciated.

asked 09 Jul '12, 13:41

kfryklund


Hard to implement with commodity hardware. Take a look at this paper:

http://www.net.t-labs.tu-berlin.de/papers/SWF-PCCH10GEE-07.pdf

Challenges are disk speed and bus speed. The paper is from 2007 and some things have changed. PCI Express now offers sufficient bandwidth (via several lanes), but if you need to capture a fully utilized 10 gig link (full duplex), you still have to write approx. 2500 MByte/s to disk at a sustained rate. That's not an easy task and probably requires special hardware. With a "software" sniffer/analyzer you have to copy data twice: once from the network adapter to RAM/CPU, and then back out to the disk subsystem.
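The 2500 MByte/s figure follows from a quick back-of-envelope calculation (assuming a fully utilized full-duplex link and ignoring pcap framing overhead):

```python
# Back-of-envelope estimate of the sustained disk write rate needed
# to capture a fully utilized 10 Gbit/s full-duplex link.
link_rate_bps = 10 * 10**9      # 10 Gbit/s in one direction
directions = 2                  # full duplex: both directions at once
bytes_per_second = link_rate_bps * directions // 8
mbyte_per_second = bytes_per_second / 10**6
print(f"required sustained write rate: {mbyte_per_second:.0f} MByte/s")  # 2500
```

Real traffic adds pcap per-packet headers on top of that, so the true write volume is slightly higher still.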

Of course, if you don't need the full payload, or you don't want to capture all conversations, you can reduce the required bandwidth considerably. The trick at 10 gig is good pre-filtering during the capture stage. Otherwise you get into trouble with I/O bandwidth, and even more so with the loads of data you have to walk through during the analysis stage.
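One cheap form of pre-filtering is truncating every packet to a fixed snapshot length (snaplen) so only the headers are written to disk. A rough sketch of the savings; the 800-byte average packet size is a made-up illustrative value, not a measurement:

```python
# Estimate the write-volume reduction from truncating packets to a snaplen.
avg_packet_size = 800    # bytes; hypothetical traffic mix, for illustration
snaplen = 96             # keep only Ethernet/IP/TCP headers plus a little payload
pcap_record_header = 16  # fixed per-packet overhead in a pcap file

full_rate = avg_packet_size + pcap_record_header       # bytes written per packet, full capture
truncated_rate = snaplen + pcap_record_header          # bytes written per packet, truncated
reduction = 1 - truncated_rate / full_rate
print(f"write volume reduced by {reduction:.0%}")
```

The flip side, of course, is that the payload is gone, so this only works if header-level analysis is all you need.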

There are commercial solutions available that claim to be able to capture at full 10 gig speed (and even 40 gig). I have no idea how they do it. I never checked and I guess there is a lot of marketing involved ;-). They will face the same problems, bus speed and I/O subsystem speed, unless they use some special "hardware magic", like copying data directly from an intelligent network adapter to the disk subsystem. Some of them do what was suggested in the paper: split the 10 gig channel into several 1 gig channels and capture those with several capture units. Search Google for: 10 gig capture.

Whatever you are going to build, you don't want to capture with Wireshark or tshark itself, as both are analysis tools, not bare-bones capture tools. They build internal state in memory, and that is way too much overhead for a live 10 gig capture. Use either dumpcap or tcpdump (or any other high-speed capture tool like gulp, netsniff-ng or ringmap) to do the raw capture job, and then you 'might' be able to use Wireshark to analyze that huge pile of data.
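As an illustration, a raw ring-buffer capture with dumpcap might be invoked along these lines. The interface name, snaplen, and ring-buffer sizes below are placeholder values for the sketch, not tuned recommendations:

```python
# Sketch of a dumpcap ring-buffer invocation for a raw, low-overhead capture.
# Interface, snaplen, and ring sizes are placeholders, not recommendations.
import shlex

argv = [
    "dumpcap",
    "-i", "eth0",              # capture interface (placeholder)
    "-s", "96",                # snaplen: truncate to headers only
    "-b", "filesize:1000000",  # rotate after ~1 GB (dumpcap sizes are in kB)
    "-b", "files:64",          # keep at most 64 files in the ring
    "-w", "/data/ring.pcapng", # base name for the ring-buffer files
]
print(shlex.join(argv))
```

dumpcap's ring buffer caps total disk usage while the capture runs indefinitely; Wireshark can later be pointed at individual ring files rather than one enormous trace.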

The following blog post might be interesting for you:

http://www.lovemytool.com/blog/2012/02/when-the-shark-bites-by-mike-canney.html#more

Regards
Kurt


answered 10 Jul '12, 02:32

Kurt Knochner ♦

Thank you Kurt, these are the exact kind of details I was looking for!

(10 Jul '12, 07:03) kfryklund

Good to hear. Good luck with your endeavor.

(10 Jul '12, 07:23) Kurt Knochner ♦

Hi Kurt, I'm facing the problem of capturing at 1 GB/s, but tcpdump is not really good here because it drops a lot of packets. So I used gulp, as you suggested. Unfortunately, the output cannot be read by Wireshark. Do you have any idea about this issue? Or could you please suggest any other tool for this 1 GB/s capturing that works on a 64-bit Linux system? Thank you very much :-)

(12 Jan '17, 18:25) hoangsonk49

This Web page for gulp seems to say that gulp writes pcap files, which Wireshark can definitely read. What happens if you try to read the file with Wireshark?

(12 Jan '17, 19:24) Guy Harris ♦♦

This may be the traffic recorder you seek: n2disk from the guys at ntop.

(12 Jan '17, 23:44) Jaap ♦

I got a "file damaged or corrupt" problem: a packet is bigger than the maximum of 65535.

(13 Jan '17, 00:13) hoangsonk49

I got a "file damaged or corrupt" problem: a packet is bigger than the maximum of 65535.

It sounds as if you're using an older version of Wireshark; 2.0 and later versions (and maybe even some older versions) support a maximum of 262144. What happens if you try to read the file with a newer version of Wireshark?

(13 Jan '17, 00:23) Guy Harris ♦♦

I use Wireshark 1.10.1. This is the error message: "(pcap: File has 1610744840-byte packet, bigger than maximum of 262144)"

(13 Jan '17, 00:34) hoangsonk49

File has 1610744840-byte packet

That looks as if gulp is not writing out valid pcap files. There is, in fact, a bug in gulp that causes it to write invalid pcap files on 64-bit platforms.

(13 Jan '17, 00:39) Guy Harris ♦♦
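For background on why a 64-bit build can corrupt a capture this way: each packet record in a pcap file must begin with a fixed 16-byte header of four 32-bit fields, independent of the platform's word size. If a tool instead writes its native in-memory header struct, whose timestamp fields widen to 64 bits on 64-bit platforms, a reader that expects the on-disk layout misinterprets timestamp bytes as length fields and reports absurd packet sizes like the one above. A sketch of the correct on-disk layout:

```python
# The on-disk pcap per-record header: four unsigned 32-bit fields,
# always 16 bytes, regardless of the capturing platform's word size.
import struct

def pcap_record_header(ts_sec, ts_usec, incl_len, orig_len):
    # "<IIII" = little-endian, four 32-bit unsigned integers
    return struct.pack("<IIII", ts_sec, ts_usec, incl_len, orig_len)

hdr = pcap_record_header(1484300000, 123456, 96, 1514)
print(len(hdr))  # 16 bytes, as the pcap format requires
```

A writer must pack these fixed-width fields explicitly rather than dumping a native struct, which is exactly the distinction the gulp fix comes down to.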

I think so. Do you know any version of gulp which solves this 64-bit platform problem? Thank you

(13 Jan '17, 01:20) hoangsonk49

Or you might have damaged the file in transfer from the capture device to the device running Wireshark. FTP doing an ASCII transfer instead of a binary transfer will corrupt the data, for instance.

(13 Jan '17, 01:31) SYN-bit ♦♦

Hi SYN-bit: I'm running on the same server, not using FTP, so I think that should not be the problem.

(13 Jan '17, 01:47) hoangsonk49

I think so. Do you know any version of gulp which solves this 64-bit platform problem?

I sent Corey Satten a small patch to fix gulp; I don't think it was ever applied, however. You'd have to apply the patch and compile gulp.

(13 Jan '17, 01:51) Guy Harris ♦♦

Thanks Harris, I received the answer from Corey. This is exactly the 64-bit gulp problem.

(15 Jan '17, 18:50) hoangsonk49
question asked: 09 Jul '12, 13:41

question was seen: 8,229 times

last updated: 15 Jan '17, 18:50

powered by OSQA