RE: RMON: Absolute vs delta

From: Brian McGahan (bmcgahan@internetworkexpert.com)
Date: Thu Jun 09 2005 - 17:17:18 GMT-3


Roy,

        Values that only increase (e.g. input errors) should use delta.
These values never decrease, so you want to know the rate of change
(delta) over time, not the absolute value. Values that both increase
and decrease (e.g. CPU utilization) should use absolute. In that case
you want to know what the absolute value is (e.g. 35% CPU utilization,
80% memory utilization, etc.) and not the rate of change. You wouldn't
care that CPU utilization increased 10% in 5 minutes, because that
wouldn't tell you whether it jumped from 10% to 90% and back down to
20% within that 5-minute sampling period.
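So for the scenario in your question (alarm when input errors exceed 50
per second), you'd sample ifInErrors (ifEntry.14) every second and alarm
on the delta. A rough sketch of what that could look like in IOS — the
alarm/event numbers, the falling threshold of 10, and the interface
index (.1) are just assumptions here, adjust them to the task:

```
! Event to fire when the alarm's threshold is crossed
rmon event 1 log trap public description "Input errors > 50/sec" owner config
! Sample ifInErrors (ifEntry.14) on ifIndex 1 every 1 second; alarm on the
! delta (change between samples) exceeding 50, not on the absolute counter
rmon alarm 1 ifEntry.14.1 1 delta rising-threshold 50 1 falling-threshold 10 1 owner config
```

With absolute instead of delta, the alarm would compare the raw
ifInErrors counter against 50, which a busy interface would cross once
and then never re-arm — not what the question asks for.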

HTH,

Brian McGahan, CCIE #8593
bmcgahan@internetworkexpert.com

Internetwork Expert, Inc.
http://www.InternetworkExpert.com
Toll Free: 877-224-8987 x 705
Outside US: 775-826-4344 x 705
24/7 Support: http://forum.internetworkexpert.com
Live Chat: http://www.internetworkexpert.com/chat/

> -----Original Message-----
> From: nobody@groupstudy.com [mailto:nobody@groupstudy.com] On Behalf Of
> Roy Dempsey
> Sent: Thursday, June 09, 2005 11:15 AM
> To: Cisco certification
> Subject: RMON: Absolute vs delta
>
> If a question asks to generate an RMON alert when the number of input
> errors on an interface exceeds 50 per second, should you use absolute
> or delta? This sounds like an absolute event, rather than a delta
> event.
>
> If it asked you to alert when the input errors *increased* by 50 per
> second I would have thought this was a delta event. However, I'm being
> told differently...
>
> Can anyone explain how you can differentiate between them?
>
> Thanks
> Roy
>
>



This archive was generated by hypermail 2.1.4 : Wed Jul 06 2005 - 14:43:41 GMT-3