One thing I would suggest is to sniff frames with a length greater than
1500 bytes (just as Brian suggested), but also to limit the capture size
(snaplen) of each packet so you get only the headers. For example, capture
the Ethernet header (14 bytes) + IP header (20 bytes) + TCP header (20
bytes, unless you have options) + application headers. So you might set the
capture limit to something like 100 or 200 bytes (depending on your
application headers). That way you avoid very large capture files by
eliminating the payload portion while keeping the important header
information, and you still limit the capture to jumbo frames.
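Putting the two ideas together, a command along these lines should work (the interface name eth0, the 200-byte snaplen, and the output filename are just examples -- adjust them for your setup):

```shell
# Capture only frames longer than 1500 bytes (i.e. jumbo frames) and
# truncate each captured packet to 200 bytes (-s snaplen) so the output
# file stays small: Ethernet (14) + IP (20) + TCP (20) = 54 bytes of
# headers, leaving headroom for IP/TCP options and application headers.
tcpdump -i eth0 -s 200 -w jumbo.pcap 'greater 1500'
```

The resulting jumbo.pcap should open in Wireshark without trouble, since each packet carries at most 200 bytes.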
Hope that makes sense.
Tom Kacprzynski
On Fri, May 31, 2013 at 11:15 AM, Brian Dennis <bdennis_at_ine.com> wrote:
> You can use the less and greater options with tcpdump to limit the packet
> sizes (i.e. tcpdump less 1500)
>
> --
> Brian Dennis, CCIEx5 #2210 (R&S/ISP-Dial/Security/SP/Voice)
> bdennis_at_ine.com
>
> INE, Inc.
> http://www.INE.com <http://www.ine.com/>
>
>
>
>
> On 5/30/13 8:06 PM, "ramesh Kumar" <rameshkumar123321_at_yahoo.com> wrote:
>
> >Hello Folks,
> >
> >I truly understand this is not the right forum to ask this question but
> >just want to see if anyone has done this in the past.. I want to capture
> >jumbo frames in my network. If i run simple tcpdump on my sniffer it will
> >give a huge file in GB's which i cant open in wireshark. Is there an
> >option in tcpdump that it captures only packets more than 1500 size and
> >ignores the rest?
> >
> >Ramesh
> >
> >
> >Blogs and organic groups at http://www.ccie.net
> >
> >_______________________________________________________________________
> >Subscription information may be found at:
> >http://www.groupstudy.com/list/CCIELab.html
Received on Sat Jun 01 2013 - 08:33:36 ART