Many minerals have been proven in research studies to be essential for optimal growth, physiologic function and productivity in ruminants. Historically, testing for these minerals has been performed on diets or dietary components to ensure adequate concentrations of specific minerals in the diet. However, general mineral analysis does not identify the chemical forms of these minerals, which can dramatically alter their bioavailability and utilization.


Although not possible for every mineral, the most specific means of diagnosing a mineral deficiency is testing animals for unique functional deficits or for deficiencies of specific mineral-containing proteins or enzymes. This type of testing is often impractical from a field perspective because of individual test costs or rigorous sample-handling requirements. When it is possible, however, it eliminates the need to know the specific molecular characteristics of a dietary mineral and the potential for competitive interactions with antagonistic minerals.

For minerals that lack an identified physiologic index that can be tested, direct quantification of the mineral in animal tissues or serum may provide a reliable indication of the overall mineral status of the animal or herd.

Mineral deficiencies can be tentatively diagnosed based on the development of clinical disease or postmortem identification of tissue lesions. However, proof of a deficiency usually requires analytical verification, since most deficiencies do not have unique clinical signs or lesions. In some instances, circumstantial proof of a deficiency can be provided by a positive response to supplementation of the suspected deficient mineral. However, a positive response may have nothing to do with the supplementation and may simply reflect the resolution of some other clinical condition over time.

An individual mineral may have multiple means of measurement for identifying deficiencies, but for most minerals one method is more specific than the others. For example, dietary concentrations may or may not reflect the amount of bioavailable mineral, and an individual tissue concentration may or may not reflect functionally available mineral concentrations at the target or functional site.

The age of the animal being tested is also important for proper interpretation of mineral status. Some minerals for which little is provided in milk accumulate at higher concentrations in fetal tissues during gestation in order to provide neonates with adequate body reserves for survival until they begin foraging. This is especially true of copper, iron, selenium and zinc. Thus, the normal range for tissue stores of these minerals is higher in young neonates than in adult animals.

When individual animals are tested, prior health status must be considered in interpreting tissue mineral concentrations. Disease states can shift minerals from tissues to serum or from serum to tissues. For example, diarrhea can result in significant loss of sodium, potassium and calcium from the body, and acidosis will cause electrolyte shifts between tissues and circulating blood.

It is known that infectious disease, stress, fever, endocrine dysfunction and trauma can alter both tissue and circulating serum or blood concentrations of certain minerals and electrolytes. Thus, evaluation of multiple animals is much more reflective of mineral status within a group than testing individual animals that are ill or have died from other disease conditions.

This [article] is directed at the animal-testing side of diagnosing mineral deficiencies and provides a summary of the tissues and fluids most commonly utilized for diagnosing specific mineral deficiencies in animals.

Live animal sampling
A variety of samples are available from live animals that can be analyzed for mineral content. The most common samples from live animals are serum and whole blood. These samples are adequate for measurement of several minerals, but it must be recognized that some disease states, as well as feeding times, can result in altered or fluctuating serum concentrations.

Other samples from live animals occasionally used for analyses include liver biopsies, urine and milk. However, since milk mineral content can vary through lactation, vary across lactations and be affected by disease, it is not typically used to evaluate mineral status. Furthermore, hydration status significantly affects urinary mineral concentrations, rendering it a poor sample for evaluation of mineral status.

Serum should be separated from the clotted red and white blood cells within one to two hours of collection. If serum sits on the clot for long periods of time, minerals with higher intracellular content than serum content can leach into the serum and falsely increase the measured concentration. Minerals for which this commonly occurs include potassium and zinc. In addition, hemolysis from both natural disease and collection technique can result in increased serum concentrations of iron, manganese, potassium, selenium and zinc.

The best type of collection tube for serum or whole blood is the royal blue-top Vacutainer tube, as it is trace metal-free. Typical red-top clot tubes will give abnormally increased results for zinc, as a zinc-containing lubricant is commonly used on the rubber stoppers. For minerals other than zinc, serum samples from typical red-top clot tubes are adequate. Similarly, serum separator tubes are typically adequate for mineral analyses, except for zinc. However, I also have found tin contamination in serum samples collected into some serum separator tubes.

Samples should be appropriately stored for preservation. Liver biopsies, urine and serum can be stored frozen long-term or refrigerated if analysis is to be completed within a few days. Whole blood and milk should be refrigerated but not frozen, as cell lysis or coagulation of solids will result.

Postmortem animal sampling
A variety of postmortem samples are available that can be analyzed for mineral content. The most common tissue analyzed is liver, as it is the primary storage organ for many of the essential minerals. Bone serves as the primary storage site for calcium, phosphorus and magnesium. Other postmortem samples that can be helpful in diagnosing mineral deficiencies include urine and ocular fluid.

Postmortem samples should be stored frozen until analyzed to prevent tissue degradation. If samples are to be analyzed within one to two days, they can be stored under refrigerated conditions.

Calcium
Analysis for calcium deficiency falls into two distinct classes. The first of these is metabolic calcium deficiency, often referred to as milk fever. The second is due to a true nutritional deficiency, which is associated with long-term dietary calcium deficits.

Analysis for metabolic calcium deficiency is aimed at detection of low systemic or circulating calcium content. In live animals, testing is performed on serum to determine circulating calcium content. In dead animals, testing is more difficult, as serum collected postmortem will not accurately reflect true serum calcium content prior to death.

However, circulating serum calcium content can be approximated from analysis of ocular fluid, with a vitreous-to-serum ratio of approximately 0.54. The Utah Veterinary Diagnostic Laboratory has been able to confirm or rule out clinical hypocalcemia in numerous postmortem cases via vitreous fluid analysis.
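As a rough illustration of this back-calculation, the following sketch estimates antemortem serum calcium from a vitreous fluid measurement. It assumes the ratio is expressed as vitreous divided by serum; the function name, units and example value are hypothetical, and any real interpretation should be left to a diagnostic laboratory.

```python
# Minimal sketch: back-calculate an antemortem serum estimate from a vitreous
# (ocular fluid) measurement, assuming the cited ratio is vitreous / serum.
# The calcium ratio (~0.54) is from the article; the input value, units and
# function name are hypothetical illustrations only.

def estimate_serum_from_vitreous(vitreous_value: float, vitreous_to_serum_ratio: float) -> float:
    """Return an approximate antemortem serum concentration."""
    return vitreous_value / vitreous_to_serum_ratio

vitreous_calcium_mg_dl = 4.0  # hypothetical postmortem result
serum_estimate = estimate_serum_from_vitreous(vitreous_calcium_mg_dl, 0.54)
print(f"Estimated antemortem serum calcium: {serum_estimate:.1f} mg/dL")
```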

True nutritional calcium deficiency is associated with weak, poor-doing animals that have swollen joints, lameness, weak bones and a propensity for broken bones. Analytical verification of calcium deficiency requires analysis of bone, since approximately 98 to 99 percent of the body's calcium is in bone, and serum concentrations are maintained by both diet and turnover of bone matrix. The bone analysis should be performed on a fat-free, dry-weight basis to remove age-related variability in moisture and fat content.

Cobalt
Cobalt deficiency is associated with deficiency of vitamin B12 (cobalamin) in ruminants. Deficiency is associated with decreased feed intake, lowered feed conversion, reduced growth, weight loss, hepatic lipidosis, anemia, immunosuppression and impaired reproductive function. Cobalt deficiency can also lead to decreased copper retention in the liver.

Tissue and serum concentrations of cobalt are generally quite low, as vitamin B12 is produced in the rumen by the microflora. Since cobalt concentrations may not truly reflect vitamin B12 concentrations, the most appropriate analysis for cobalt deficiency is direct quantification of serum or liver vitamin B12. However, ruminants produce numerous forms of cobalamins with differing bioactivity, making interpretation of analytical results difficult.

Cobalamin is absorbed into circulation and small amounts are stored in the liver. Of the tissues available, the liver cobalt concentration best reflects the animal’s overall status, but it may not be truly reflective of vitamin B12 content.

Copper
Copper deficiency is a commonly encountered nutritional problem in ruminants, but copper excess is also commonly encountered. Deficiency can produce a large array of adverse effects. Reduced growth rates, decreased feed conversion, abomasal ulcers, lameness, poor immune function, sudden death, achromotrichia and impaired reproductive function are commonly encountered with copper deficiency.

The best method for diagnosing copper status is via analysis of liver tissue, although much testing is performed on serum. Deficiency within a herd will result in some animals that have low serum copper concentrations, but serum content does not fall until liver copper is fairly depleted. In herds that have tested livers and found a high incidence of deficiency, it is not uncommon for a high percentage of the animals to have “normal” serum concentrations.

At the Utah Veterinary Diagnostic Laboratory, it is commonly recommended that 10 percent of a herd, or a minimum of 10 to 15 animals, be tested in order to have a higher probability of diagnosing a copper deficiency via serum quantification.
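As a rough illustration of that rule of thumb, the sketch below computes a suggested number of animals to sample. The 10 percent figure and the 10-to-15 animal minimum are from the article; the choice of 15 as the floor and the function name are assumptions for illustration.

```python
import math

# Minimal sketch of the herd-sampling rule of thumb described above:
# sample roughly 10 percent of the herd, with a minimum of 10 to 15 animals.
# Using 15 as the floor is an assumption; the result is capped at herd size.

def suggested_sample_size(herd_size: int, fraction: float = 0.10, minimum: int = 15) -> int:
    return min(herd_size, max(minimum, math.ceil(herd_size * fraction)))

for herd in (40, 120, 500):
    print(f"Herd of {herd}: sample about {suggested_sample_size(herd)} animals")
```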

Even in a deficient herd, low serum copper concentrations may be seen in as few as 20 percent of the individuals. Herds that would be classified as marginally deficient based on liver testing may have predominantly normal serum copper concentrations. Thus, serum copper analysis should be viewed as a screening method only.

Another factor that can influence diagnosis of copper deficiency in serum is the presence of high serum molybdenum. As the copper-sulfur-molybdenum complex that forms is not physiologically available for tissue use, normal serum copper content in the presence of high serum molybdenum should always be considered suspect. In addition, the form of selenium supplementation can alter the normal range for interpretation of serum copper status, with selenite-supplemented cows having a lowered normal range for serum copper.

Copper deficiency can also be diagnosed via analysis of copper-containing enzymes. The two most commonly utilized are ceruloplasmin and superoxide dismutase. Low concentrations of these enzymes in serum and whole blood, respectively, are diagnostic for copper deficiency, although ceruloplasmin concentrations can increase with inflammatory disease. The higher cost of these enzyme analyses compared with liver copper analysis often limits their use.

Excessive supplementation of copper in dairy cattle is a relatively common finding at the Utah Veterinary Diagnostic Laboratory. Liver copper concentrations greater than 200 parts per million (ppm) are routinely identified. In comparison, the recommended adequate liver copper concentration range in cattle is 25 to 100 ppm.
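To make those interpretation thresholds concrete, the sketch below classifies a liver copper result against the 25-to-100 ppm adequate range and the greater-than-200 ppm level described as excessive. The category labels and the handling of results between 100 and 200 ppm are assumptions for illustration, not laboratory cutoffs.

```python
# Minimal sketch classifying a liver copper result (ppm) against the ranges
# cited above: 25-100 ppm adequate, >200 ppm routinely seen with excessive
# supplementation. Labels and the 100-200 ppm handling are illustrative only.

def classify_liver_copper(ppm: float) -> str:
    if ppm < 25:
        return "below adequate range (possible deficiency)"
    if ppm <= 100:
        return "within recommended adequate range"
    if ppm <= 200:
        return "above adequate range"
    return "consistent with excessive supplementation"

for result in (12, 60, 150, 320):
    print(f"{result} ppm: {classify_liver_copper(result)}")
```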

Iron
As an essential component of proteins involved in the electron transport chain and oxygen transport, iron is essential for normal function of all cell types. Iron deficiency is associated with reduced growth, poor immune function, weakness and anemia. Although offspring are typically born with liver reserves of iron (provided the dam had adequate iron reserves), milk has a low iron content, which results in iron deficiency over time in animals fed a diet of milk only, as is the case with veal calves.

Both liver and serum concentrations are commonly utilized to diagnose iron deficiency. When using serum to measure iron content, samples that have evidence of hemolysis should not be used, as they will have artificially increased iron content from the ruptured red blood cells.

In addition, disease states can alter serum and liver iron concentrations as the body both tries to limit the availability of iron to invading organisms and increases its availability to the body's immune cells. Thus, interpretation of iron status should be made with consideration of the overall health of the animal.

Other measures that can assist with diagnosis of iron status include serum iron-binding capacity, serum iron-binding saturation, red blood cell count, packed cell volume, hemoglobin concentration and ferritin concentration. However, a variety of clinical conditions can cause these values to vary, including bacterial infections, viral infections, other types of inflammation, hemorrhage, bleeding disorders and immune-mediated disorders.

Magnesium
Similar to calcium, analysis for magnesium deficiency falls into two distinct classes. The first is metabolic magnesium deficiency, often referred to as grass tetany. The second is due to a true nutritional deficiency, which is associated with long-term dietary magnesium deficits.

Analysis for metabolic magnesium deficiency is aimed at detecting low systemic or circulating magnesium content. In live animals, testing is performed on serum to determine circulating magnesium content. It must be noted that ruminants displaying recumbency or tetany may have normal serum magnesium, as tissue damage releases magnesium from soft tissues into the serum. In dead animals, testing is more difficult, as serum collected postmortem will not accurately reflect true serum magnesium content prior to death.

Circulating serum magnesium content can be approximated from analysis of ocular fluid, with a vitreous-to-serum ratio of 1.05. The Utah Veterinary Diagnostic Laboratory has confirmed clinical hypomagnesemia in numerous postmortem cases via vitreous fluid analysis.
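The same back-calculation sketched earlier for calcium applies here, using the magnesium ratio of 1.05; the input value below is hypothetical.

```python
# Same back-calculation as for calcium, using the magnesium vitreous-to-serum
# ratio of 1.05 cited above; the measurement value is hypothetical.
vitreous_magnesium_mg_dl = 1.4
estimated_serum_mg = vitreous_magnesium_mg_dl / 1.05
print(f"Estimated antemortem serum magnesium: {estimated_serum_mg:.2f} mg/dL")
```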

Urine is another postmortem sample that can be analyzed, since at times of low serum magnesium the kidneys minimize magnesium loss in the urine; very low urinary magnesium therefore supports a diagnosis of deficiency.

True nutritional magnesium deficiency is not recognized in ruminants except under experimental conditions. This syndrome is associated with weak animals that have weak bones, low bone ash and calcification of soft tissues. Analytical verification of true magnesium deficiency would require analysis of bone, since approximately 70 percent of the body's magnesium is in bone. The bone analysis should be performed on a fat-free, dry-weight basis to remove age-related variability in moisture and fat content.

Manganese
Manganese deficiency in ruminants is associated with impaired reproductive function, skeletal abnormalities in calves and less-than-optimal productivity. Cystic ovaries, silent heat, reduced conception rates and abortions are the typical reproductive effects. Calves that are manganese deficient can be weak, small and develop enlarged joints or limb deformities.

Manganese deficiency, although not reported often, is identified routinely in dairy cattle when they are tested. Of interest, most testing of beef cattle finds normal manganese concentrations in liver, blood and serum, yet in these same matrices greater than 50, 75 and 95 percent, respectively, of dairy cattle tested are below recommended normal concentrations. This may be due, in part, to the high calcium and phosphorus content of dairy rations, which can be antagonistic to the bioavailability of manganese.

Of the samples available, liver is the most indicative of whole-body status, followed by whole blood and then serum. As red blood cells have higher manganese content than serum, hemolysis can result in increased serum content. Because the normal serum concentration of manganese is quite low, many laboratories do not offer this analysis due to inadequate sensitivity. Overall, response to supplementation has frequently been used as a means of verifying manganese deficiency, but it is critical that a bioavailable form be used.

Phosphorus
Phosphorus status is somewhat difficult to measure in animal tissues. Serum and urine phosphorus concentrations can aid in diagnosing deficiency, but because bone phosphorus is mobilized to maintain serum content, significant drops in serum and urine may take weeks to develop. Serum phosphorus should be measured as inorganic phosphorus for proper interpretation. Longer-term phosphorus deficiency can be diagnosed postmortem by measuring the phosphorus content of bone or bone ash.

Dietary phosphorus concentrations or response to supplementation are better indicators of deficiency than tissue concentrations unless a severe, long-term deficiency has occurred.

The predominant effects of low dietary phosphorus stem from diminished appetite. Depressed feed intake, poor growth and weight loss are common with phosphorus-deficient diets. Longer-term phosphorus deficiency results in impaired reproductive performance, diminished immune function and bone abnormalities.

Potassium
Tissue concentrations of potassium correlate poorly with dietary status. Of the animal samples available, serum potassium is the best indicator of deficiency, but disease states can cause electrolyte shifts that result in lowered serum potassium when dietary deficiency has not occurred. In addition, serum that is hemolyzed or left on the clot too long may have falsely increased potassium content because of leakage from the red blood cells. Thus, dietary potassium concentrations are a better guide to potassium status.

Dietary potassium deficiency affects intake, productivity, heart function and muscle function. Common clinical signs of severe potassium deficiency include diminished feed intake, reduced water intake, poor productivity, weakness and recumbency.

Selenium
Selenium is one of the minerals most commonly identified as deficient in ruminants. Selenium deficiency in ruminants is associated with adverse effects on growth, reproduction, immune system function, offspring and muscle tissues. White muscle disease, a necrosis and scarring of cardiac or skeletal muscle, is linked to severe selenium deficiency, although it can also be caused by vitamin E deficiency. Reduced growth rates, poor immune function and impaired reproductive performance can be observed with less severe selenium deficiency.

Diagnosis of a deficiency can be made by analysis of liver, whole blood or serum for selenium content or by analysis of whole blood for activity of glutathione peroxidase, a selenium-dependent enzyme. The most specific analysis is that of whole-blood glutathione peroxidase, as it verifies true functional selenium status. Liver is the optimal tissue to analyze for selenium content, as it is a primary storage tissue.

Between serum and whole blood, the former better reflects recent intake, while the latter better reflects long-term status. Since selenoproteins are incorporated into red blood cells when they are made, and the cells have a long half-life, whole-blood selenium content reflects intake over the preceding months.

In order to adequately diagnose selenium deficiency, the dietary form of the selenium intake by the animals is important. Natural selenium, predominantly in the form of selenomethionine, is metabolized and incorporated into selenium-dependent proteins, but it can also be incorporated into nonspecific proteins in place of methionine.

Inorganic selenium is metabolized and only incorporated into selenium-dependent proteins. Thus, normal concentrations in serum and whole blood differ depending on whether the dietary selenium is a natural organic form or an inorganic supplement.

Sodium
Tissue concentrations of sodium poorly correlate with dietary deficiency. Of the animal samples available, serum and urine are the best for measuring sodium deficiency, but disease states can cause electrolyte shifts that result in lowered serum or urinary sodium even when dietary concentrations are adequate. Thus, dietary sodium concentrations are a better guide to diagnosing a deficiency.

Dietary sodium deficiency affects feed intake and productivity. Common clinical signs of severe sodium deficiency include diminished feed intake, reduced water intake and poor productivity.

Zinc
Zinc is an essential mineral required by all cells in animals. Zinc plays a role in numerous enzymatic reactions. Deficiencies of zinc are associated with reduced growth, poor immune function, diminished reproductive performance and poor offspring viability, as well as skin lesions in severe cases.

Tissue zinc concentrations do not reflect body status well. Of the common samples tested, liver and serum are the best indicators of zinc status. But serum and liver zinc can be altered by age, infectious diseases, trauma, fever and stress. It has been suggested that pancreas zinc content is the best means of truly identifying zinc deficiency.

Response trials have shown that some animals with low-end normal liver or serum zinc can still show improvement in some clinical conditions when supplemented. Thus, liver and serum verify deficiency only when these samples have very low zinc content.

Conclusions
A variety of samples can be tested for mineral content, but any single sample may not provide an indication of the overall mineral status of the animal. Appropriate diagnosis of mineral status involves thorough evaluation of groups of animals, including health history, feeding history, supplementation history and analysis of several animals for their mineral status.

Dietary mineral evaluation should augment the mineral evaluation of animal groups. If minerals are deemed to be adequate in the diet but the animals are found to be deficient, antagonistic interactive effects of other minerals need to be investigated. As an example, high sulfur or iron can cause deficiencies in copper and selenium, even when there are adequate concentrations in the diet. PD

References omitted due to space but are available upon request.

—From 2006 Intermountain Nutrition Conference Proceedings