Technology Description

The complications associated with correctly calibrating infusion sets stem from a mismatch in units: physicians' scripts (Rx) are specified as X millilitres per hour, whereas infusion sets make no provision for this method of calibration and are instead pre-calibrated as X drops per millilitre. Converting between these two universal but conflicting conventions is a complicated procedure that needs to be repeated continuously.
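
For illustration, the arithmetic a clinician must perform by hand can be sketched as follows. This is a minimal example; the function name and the macro-drip set factor of 20 drops per ml are common illustrative assumptions, not taken from the source:

    def drip_rate(ml_per_hour: float, drops_per_ml: float = 20.0) -> float:
        """Convert a prescribed flow rate (ml/hr) into a drip rate (drops/min).

        drops/min = (ml/hr * drops/ml) / 60 min/hr
        """
        return ml_per_hour * drops_per_ml / 60.0

    # Example: a 125 ml/hr script on an assumed 20 drops/ml set
    print(round(drip_rate(125), 1))  # 41.7 drops/min, to be counted and
                                     # adjusted against a watch at the bedside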

IV™trend has been specifically designed to resolve the conflicting and complicated method of calibrating infusion, transfusion and oncology sets. It has been designed to:

  • Standardise the method of calibration;
  • Eliminate incorrect calibration of the infusion device;
  • Ensure the correct therapeutic concentration in the patient's bloodstream;
  • Prevent over- or under-infusion of the patient; and
  • Save clinicians' and physicians' time on the recalibration process.

The extent of this problem was quantified in a recent study conducted at leading private hospitals, which established that correct calibration of infusion sets was simply not being achieved. Calibrating an infusion set is a time-consuming and complicated task. Furthermore, because the flow rate is not constant (when the IV™flow is not used), calibration becomes a process that clinicians and nursing staff must manage continuously, adding to their already strained workload.

The following graphs illustrate the extent of this severe and universal problem of ensuring correct calibration in a typical hospital setup (note that the prescribed script [Rx] and intended flow rate was 125 ml/hr):

The individual flow rates (per infusion bag or vaculiter) reflected in Graph 1 illustrate the wide variation from the prescribed script of 125 ml per hour, ranging from a minimum of 23 ml per hour to a maximum of 500 ml per hour. The median of this data set is 88 ml per hour, a deviation of roughly 30% from the intended 125 ml per hour script.
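
The quoted 30% figure can be checked directly from the numbers above; a minimal sketch, using only the rates stated in the text:

    rx = 125.0          # prescribed rate, ml/hr
    median_rate = 88.0  # median observed rate from Graph 1, ml/hr
    deviation = (rx - median_rate) / rx
    print(f"{deviation:.1%}")  # 29.6%, i.e. the ~30% deviation cited above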

The gold standard in IV-administration accuracy, established by electronic infusion pump systems, is a deviation of no more than 5%. Only 7% of the 125 ml per hour vaculiters achieved this level of accuracy, and even if the acceptable deviation were hypothetically widened to 10%, only 12% of patients were infused at an acceptable rate.
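
A minimal sketch of how individual bags could be scored against these tolerance bands is shown below; the study's per-bag rates are not reproduced here, so the function name and the sample list are purely illustrative:

    def within_tolerance(rate_ml_hr: float, rx_ml_hr: float = 125.0,
                         tolerance: float = 0.05) -> bool:
        """True if the observed rate deviates from the script by at most `tolerance`."""
        return abs(rate_ml_hr - rx_ml_hr) / rx_ml_hr <= tolerance

    rates = [23, 88, 120, 127, 135, 500]  # illustrative values only
    for tol in (0.05, 0.10):
        hits = sum(within_tolerance(r, tolerance=tol) for r in rates)
        print(f"within {tol:.0%}: {hits}/{len(rates)} bags")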

From Graph 2 it is clear that under-infusion (72% of bags) is the norm, with over-infusion occurring in 22% of vaculiters. Fluid overload can lead to pulmonary congestion and hyponatraemia, the most common electrolyte abnormality in hospitalised patients. If the plasma sodium concentration declines to less than 120 mmol/litre within 48 hours, brain swelling may result in herniation, with devastating consequences.

In this study, 19% of patients were infused at a rate more than 22% above the prescribed rate (a minimum fluid excess of 700 ml per 24 hours). Although a normal, healthy subject can compensate physiologically, for clinically compromised patients this can be catastrophic.
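
The fluid-excess figure follows from simple arithmetic on the prescribed rate; a minimal sketch is given below. Note that a rate exactly 22% above a 125 ml/hr script corresponds to about 660 ml of excess fluid per day, so the study's stated minimum of 700 ml reflects rates somewhat further beyond that threshold:

    def fluid_excess_ml_per_day(actual_ml_hr: float, rx_ml_hr: float = 125.0) -> float:
        """24-hour fluid excess (ml) relative to the prescribed rate."""
        return (actual_ml_hr - rx_ml_hr) * 24.0

    print(fluid_excess_ml_per_day(125 * 1.22))  # 660.0 ml/day at the 22% threshold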

The IV™trend ensures correct calibration of any infusion set and will ensure that such discrepancies will no longer occur in infusion care.