Aims: To investigate the effect of parenteral vitamin B12 supplementation on the growth rate of dairy heifer calves over the summer and autumn on seven farms in the Central Plateau of New Zealand, an area historically associated with low cobalt concentrations in grazing pasture.
Methods: This was a controlled clinical trial conducted on a convenience sample of seven farms, with young female calves randomly assigned to three vitamin B12 treatment groups and followed through a grazing season. Two treatment groups received either monthly SC injections of a short-acting (SA) B12 formulation or 3-monthly injections of a long-acting (LA) B12 formulation, and the third group received no treatment (NT). No other parenteral vitamin B12 was given; however, all calves received additional cobalt (0.04-0.4 mg Co/kg liveweight) in the mineralised anthelmintic drenches given orally every month. Liveweight was recorded in December/January and at the end of the trial in May, June or July, depending on the farm. Pasture cobalt concentrations (mg/kg DM) were measured every month using 500-g herbage samples collected along 100-m transects in the area about to be grazed by the trial groups.
Results: There was evidence for a difference in growth rate between groups, with mean final weights of 228 (95% CI = 212-243) kg for the LA group, 224 (95% CI = 209-239) kg for the SA group and 226 (95% CI = 211-241) kg for the NT group (global p-value = 0.014). Calves given SA vitamin B12 were 3.77 (95% CI = 0.71-6.82) kg lighter than calves given LA vitamin B12 (p = 0.011). There was no evidence for a change in pasture cobalt concentrations over the trial period (p = 0.32).
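For readers unfamiliar with how such group comparisons are computed, the following is a minimal Python sketch of a global test across three groups followed by a pairwise mean difference with a 95% CI. All weights and group sizes below are simulated; the trial's actual model is not described in the abstract and very likely adjusted for farm and other covariates.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical final liveweights (kg); group sizes and spread are invented
la = rng.normal(228, 15, 40)
sa = rng.normal(224, 15, 40)
nt = rng.normal(226, 15, 40)

# Global test for any between-group difference (analogue of the global p-value)
f_stat, p_global = stats.f_oneway(la, sa, nt)

# Pairwise LA vs. SA mean difference with a t-based 95% CI
diff = la.mean() - sa.mean()
se = np.sqrt(la.var(ddof=1) / la.size + sa.var(ddof=1) / sa.size)
dof = la.size + sa.size - 2
lo, hi = diff + np.array([-1, 1]) * stats.t.ppf(0.975, dof) * se
print(f"global p = {p_global:.3f}; LA-SA difference = {diff:.2f} kg "
      f"(95% CI {lo:.2f} to {hi:.2f})")
```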
Conclusions and clinical relevance: The results of this trial raise the question of whether routine vitamin B12 supplementation of young cattle in areas traditionally considered cobalt deficient is necessary, and further raise the possibility that vitamin B12 supplementation by repeated injection of SA products may negatively affect growth rates.
Aim: To biomechanically compare the bending stiffness, strength, and cyclic fatigue of additively manufactured (AM) and conventionally manufactured (CM) titanium limited-contact plates (LCP) of equivalent dimensions using plate-screw constructs.
Methods: Twenty-four 1.5/2.0-mm plate constructs (CM: n = 12; AM: n = 12) were tested under 4-point bending conditions. Data were collected during quasi-static single-cycle-to-failure and cyclic fatigue testing until implants plastically deformed or failed. Bending stiffness, bending structural stiffness, and bending strength were determined from load-displacement curves. Fatigue life was determined as the number of cycles to failure. Median values of each test variable were compared between the AM and CM groups using the Wilcoxon rank sum test. Fatigue data were also analysed using the Kaplan-Meier estimator of the survival function.
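As an illustration of the two analyses named here, the sketch below runs a Wilcoxon rank sum test on hypothetical bending-strength values and fits Kaplan-Meier estimators to hypothetical cycles-to-failure data. All values are invented, and the `lifelines` package is an assumed dependency, not one the authors name.

```python
from scipy.stats import ranksums
from lifelines import KaplanMeierFitter

# Hypothetical bending strengths (Nm) for each construct type
am_strength = [3.0, 3.05, 3.07, 3.1, 3.2, 3.4]
cm_strength = [2.5, 2.52, 2.57, 2.58, 2.6, 2.6]

stat, p = ranksums(am_strength, cm_strength)  # Wilcoxon rank sum test
print(f"Wilcoxon rank sum: p = {p:.3f}")

# Kaplan-Meier fit per group; an event_observed value of 0 would mark a
# run-out (a construct still intact when the test was stopped).
for label, cycles in [("AM", [14427, 16000, 17500, 18704, 25000, 33228]),
                      ("CM", [73557, 110000, 150000, 164272, 200000, 250000])]:
    kmf = KaplanMeierFitter()
    kmf.fit(cycles, event_observed=[1] * len(cycles), label=label)
    print(f"{label}: median cycles to failure = {kmf.median_survival_time_:,.0f}")
```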
Results: There was no evidence for a difference in bending stiffness or bending structural stiffness between AM and CM constructs. However, AM constructs exhibited greater bending strength (median 3.07 (min 3.0, max 3.4) Nm) under quasi-static 4-point bending than CM constructs (median 2.57 (min 2.5, max 2.6) Nm; p = 0.006). The number of cycles to failure under dynamic 4-point bending was higher for CM constructs (median 164,272 (min 73,557, max 250,000) cycles) than for AM constructs (median 18,704 (min 14,427, max 33,228) cycles; p = 0.02). Survival analysis showed that 50% of AM plates had failed by 18,842 cycles, whereas 50% of CM plates had failed by 78,543 cycles.
Conclusion and clinical relevance: Additively manufactured titanium implants, printed to replicate a conventional titanium orthopaedic plate, failed after fewer loading cycles despite being stronger in single-cycle-to-failure testing. Patient-specific implants made using this process may be brittle and therefore not comparable to CM orthopaedic implants. Careful case-by-case selection of their use is recommended.
Case history: In mid-summer (February), 42 of a flock of 68 ram hoggets (approximately 5 months of age) and two of a group of 14 alpacas on a farm in the Manawatū region of New Zealand were found recumbent or dead following a period of persistent rain, strong winds and relatively low temperatures. The hoggets and alpacas had been shorn 4 and 53 days previously, respectively, and were in adequate to good body condition with access to ad libitum pasture. Post-mortem and histological examinations were undertaken on four hoggets and two alpacas.
Clinical findings: Apart from hypothermic body temperatures in four recumbent hoggets, nothing of significance was identified on clinical or gross pathological examination. Histological changes of vacuolar hepatopathy, renal tubular degeneration and pulmonary congestion were present in all animals examined.
Diagnosis: Based on the history and the clinical and pathological findings, hypothermia was considered the most probable cause of the deaths.
Clinical relevance: These cases emphasise the importance of shelter for recently shorn sheep and alpacas regardless of the season.
Aims: To evaluate, in a pasture-based dairy herd, the effect of a three-time-point hoof trimming regime on lameness incidence and on the time from calving to first observation of an elevated locomotion score (LS).
Methods: This study was conducted on a 940-cow spring-calving herd in New Zealand's North Island between May 2018 and May 2019. Cows (n = 250) were randomly allocated to the hoof trimming group, with the remainder assigned to the non-trim cohort. One trained professional hoof trimmer used the five-step Dutch method to trim the hind feet of the trimming group. Throughout the subsequent production season, the whole herd was locomotion-scored fortnightly using the 4-point (0-3) DairyNZ lameness score. Kaplan-Meier survival curves were used to assess the univariable effect of trimming on the interval between calving and first LS ≥ 2 and first LS ≥ 1. A multivariable Cox proportional hazards regression was used to further evaluate the effect of trimming on time to elevated LS.
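The time-to-event workflow described here can be sketched as follows, assuming a simple one-row-per-cow data frame. The data and the single covariate are fabricated for illustration; the published multivariable model will have included additional covariates, and `lifelines` is an assumed dependency.

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# One row per cow: fabricated days from calving to first LS >= 2, a
# censoring indicator (0 = never reached LS >= 2 before season end),
# and the trim-group flag.
df = pd.DataFrame({
    "days_to_ls2": [40, 90, 120, 300, 55, 210, 365, 365],
    "observed":    [1, 1, 1, 1, 1, 1, 0, 0],
    "trimmed":     [1, 1, 1, 1, 0, 0, 0, 1],
})

# Univariable Kaplan-Meier curve per trim group
for grp, sub in df.groupby("trimmed"):
    KaplanMeierFitter().fit(sub["days_to_ls2"], sub["observed"],
                            label=f"trimmed={grp}")

# Cox proportional hazards model; exp(coef) is the hazard ratio for trimming
cph = CoxPHFitter()
cph.fit(df, duration_col="days_to_ls2", event_col="observed")
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```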
Results: Mean lameness (LS ≥ 2) prevalence was 2.6%; among cows with ≥ 4 observations during the study period, 30% had at least one LS ≥ 2. Mean prevalence of LS ≥ 1 was 40%; among cows with ≥ 4 observations, 98.6% had at least one LS ≥ 1 during lactation. Hoof trimming had no apparent effect on the incidence of clinical lameness (LS ≥ 2) (trimmed vs. non-trimmed: 33.2% vs. 28.8%, respectively), but there was a small decrease in the incidence of LS ≥ 1 (trimmed vs. non-trimmed: 96.9% vs. 99.3%, respectively). The hazard of a first observed LS ≥ 2 in the control group was 0.87 (95% CI = 0.66-1.14) times that of the trimmed group; however, the hazard of a first LS ≥ 1 was 1.60 (95% CI = 1.37-1.88) times higher in the control group than in the trimmed group.
Conclusion and clinical relevance: On this farm, prophylactic hoof trimming had no clinically relevant impact on the incidence of clinical lameness and was not associated with clinically beneficial reductions in time to first observed LS ≥ 2. This may be because claw horn imbalance was not pronounced on this farm, with 53% of cows needing no trim on either hind limb on the first trimming occasion. Further research on the response to prophylactic trimming in pasture-based dairy cattle is required.
Aims: To compare the retention of three hoof block products commonly used in the remediation of lameness, in New Zealand dairy cows kept at pasture in a lame cow group.
Methods: Sixty-seven farmer-presented Friesian and Friesian × Jersey dairy cows from a single herd in the Manawatū region of New Zealand, suffering from unilateral hind limb lameness attributable to a claw horn lesion (CHL), were randomly allocated to one of three treatments: foam block (FB), plastic shoe (PS) or standard wooden block (WB). Blocks were applied to the contralateral healthy claw and checked daily by farm staff (present/not present), and the date of loss was recorded. Blocks were reassessed on Day 14 and Day 28 and then removed unless further elevation was indicated. Daily walking distances were calculated using a farm map and measurement software. Statistical analyses included a linear marginal model for distance walked until block loss and a Cox regression model for the relative hazard of a block being lost.
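A linear marginal model of the kind mentioned here is commonly fitted as a GEE with an exchangeable working correlation; the sketch below illustrates that approach under that assumption, with fabricated walking-distance data (the Cox component is analogous to the survival sketch shown earlier).

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Fabricated repeated daily walking distances (km) per cow while the block
# was on; each cow carries one block product.
df = pd.DataFrame({
    "cow":      [1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6],
    "product":  ["FB", "FB", "PS", "PS", "WB", "WB",
                 "FB", "FB", "PS", "PS", "WB", "WB"],
    "distance": [0.30, 0.32, 0.25, 0.28, 0.40, 0.38,
                 0.22, 0.24, 0.35, 0.33, 0.29, 0.31],
})

# A marginal (population-averaged) model: GEE with an exchangeable working
# correlation to handle the within-cow correlation of repeated measurements.
model = smf.gee("distance ~ product", groups="cow", data=df,
                cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary())
```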
Results: Random allocation resulted in only small differences between products in the proportions applied to the left or right hind foot and to the lateral or medial claw. Mean distance walked/cow/day on farm tracks while the block was present was 0.32 (min 0.12, max 0.45) km/day; no biologically important difference between products in mean distance walked was identified. Compared to PS, cows in the WB group were nearly five times more likely to lose the block (HR = 4.8 (95% CI = 1.8-12.4)), while cows in the FB group were 9.5 times more likely to lose the block (HR = 9.5 (95% CI = 3.6-24.4)).
Conclusions: In this study, PS were retained for much longer than either FB or WB. As cows were managed in a lame cow group for the study duration, walking distances were low and did not impact on the risk of block loss. More data are needed to define ideal block retention time.
Clinical relevance: In cows with CHL, the choice of block could be based on the type of lesion present and the expected re-epithelialisation time.
Aims: To compare intraocular pressure (IOP) measurements obtained in rabbits using rebound (TV) and applanation (TPV) tonometers with four different methods of physical restraint.
Methods: A total of 20 New Zealand White rabbits (40 eyes) were included in this study. IOP readings were obtained from both eyes using the two tonometers. The rabbits were placed on a table and restrained by wrapping in a cloth (Method I), scruffing with rear support (Method II), wrapping in a cloth and cupping in the hands (Method III), or using a box restrainer (Method IV).
Results: The mean IOP measurement obtained by TPV was higher than that obtained with the TV for all handling methods. Mean differences (TV-TPV, in mmHg) in IOP were -5.3 (95% CI = -6.5 to -4.1) for Method I, -4.7 (95% CI = -6.2 to -3.3) for Method II, -4.9 (95% CI = -6.2 to -3.7) for Method III and -7.6 (95% CI = -9.2 to -5.9) for Method IV. Using the TV tonometer, mean IOP for Method IV was higher than for Method I (mean difference 2.1 (95% CI = 1.1-3.1)), whereas using the TPV tonometer, mean IOP for Method IV was significantly higher than for Methods I, II and III (mean differences: 4.4 (95% CI = 2.6-5.9), 3.7 (95% CI = 2.0-5.3) and 3.8 (95% CI = 2.0-5.4), respectively). According to Bland-Altman plots, IOP readings for TPV tended to be higher than those for TV with all handling methods, but with a lack of agreement. The mean differences and 95% limits of agreement between TV and TPV were -5.4 mmHg (-12.5 to 1.9 mmHg), -4.7 mmHg (-12.9 to 3.5 mmHg), -4.9 mmHg (-12.0 to 2.2 mmHg) and -7.5 mmHg (-17.4 to 2.3 mmHg) for Methods I, II, III and IV, respectively. Comparing TV and TPV, only 7.5%, 12.5%, 27.5% and 15% of IOP measurements from the 20 rabbits were within the range considered clinically acceptable for IOP (± 2 mmHg) for Methods I, II, III and IV, respectively.
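For clarity, the Bland-Altman quantities quoted here (bias and 95% limits of agreement) are simply the mean of the paired differences and that mean ± 1.96 SD of the differences. The short sketch below computes them for invented paired readings.

```python
import numpy as np

# Hypothetical paired IOP readings (mmHg) from the two tonometers
tv = np.array([12.1, 14.3, 11.8, 13.0, 15.2, 12.6])   # rebound (TV)
tpv = np.array([17.0, 19.1, 16.5, 18.9, 20.0, 18.2])  # applanation (TPV)

diff = tv - tpv
bias = diff.mean()                                     # mean difference
loa = bias + np.array([-1.96, 1.96]) * diff.std(ddof=1)  # 95% limits of agreement
within_2mmhg = np.mean(np.abs(diff) <= 2) * 100        # % clinically acceptable

print(f"bias = {bias:.1f} mmHg; 95% LoA = {loa[0]:.1f} to {loa[1]:.1f} mmHg; "
      f"{within_2mmhg:.0f}% within ± 2 mmHg")
```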
Conclusion and clinical relevance: The physical restraint method should be recorded when IOP is measured in rabbits, and the TV and TPV tonometers cannot be used interchangeably, given the high bias and the low proportion of measurements within ± 2 mmHg.
Aims: To evaluate the effect of IM administration of three sedative drugs, acepromazine, alfaxalone and dexmedetomidine, in combination with morphine, on the size of the feline spleen using ultrasonography.
Methods: Twenty-four client-owned cats undergoing elective de-sexing or minor procedures were recruited for a focused ultrasonographic examination of the spleen prior to and at 10, 20 and 30 minutes following administration of one of three randomly assigned IM sedation protocols: 0.05 mg/kg acepromazine (ACE group), 3 mg/kg alfaxalone (ALF group), or 10 μg/kg dexmedetomidine (DEX group), in combination with 0.5 mg/kg morphine. B-mode images of the spleen were collected and measured following a standardised protocol. Cardiorespiratory parameters and sedation score were also recorded. Mean thickness of the head, body and tail of the spleen for each group at 10, 20 and 30 minutes after drug administration was compared to baseline.
Results: Mean splenic thickness increased over time in the ACE group (thickness of body at T0 = 8.9 (SE 2.1) mm and at T30 = 10.5 (SE 2.0) mm; p = 0.001) and the ALF group (thickness of body at T0 = 8.8 (SE 1.0) mm and at T30 = 10.3 (SE 1.7) mm; p = 0.022) but not in the DEX group (thickness of body at T0 = 8.6 (SE 1.2) mm and at T30 = 8.9 (SE 0.6) mm; p = 0.67). Mean arterial blood pressure in the DEX group was significantly higher than in the other groups (p = 0.002). Sedation scores in the DEX group remained consistently high for the entire period. In contrast, the sedation score in the ACE group increased over the 30 minutes (p = 0.007), while the sedation score in the ALF group was highest at 10 minutes and gradually decreased over the following 20 minutes (p = 0.003).
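One common way to test such baseline-to-30-minute changes while accounting for repeated measurements on the same cat is a linear mixed model with a random intercept per cat; the abstract does not state which test the authors used, so the sketch below, with fabricated thickness data, is only one plausible approach.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Fabricated splenic body thickness (mm) at baseline (0) and 30 minutes
df = pd.DataFrame({
    "cat":       [1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6],
    "time_min":  [0, 30] * 6,
    "thickness": [8.9, 10.5, 8.7, 10.2, 9.1, 10.8,
                  8.8, 10.3, 8.6, 10.1, 9.0, 10.6],
})

# Random intercept per cat accounts for the paired structure; the
# coefficient on C(time_min)[T.30] estimates the change from baseline.
model = smf.mixedlm("thickness ~ C(time_min)", df, groups=df["cat"])
print(model.fit().summary())
```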
Conclusions: Sedation with IM dexmedetomidine and morphine did not change splenic size, whereas acepromazine or alfaxalone combined with morphine increased it, regardless of the degree of sedation.
Clinical relevance: Where splenomegaly is identified in a cat sedated with acepromazine or alfaxalone, the effects of the sedation protocol could be considered as a possible cause.