Volume 9, Issue 1

Battle of the Benzodiazepines: Comparison of Treatment Outcomes for Alcohol Withdrawal Syndrome: Lorazepam vs Chlordiazepoxide - A Literature Review
Original Research
Alcohol use disorder is common in the United States, affecting approximately 18% of the general population. Roughly 50% of these individuals experience Alcohol Withdrawal Syndrome (AWS), a clinical diagnosis characterized by autonomic hyperactivity following abrupt abstinence from heavy alcohol consumption. AWS is a life-threatening disorder that for many years has been treated with tapering doses of benzodiazepines, most commonly chlordiazepoxide (CDE) and lorazepam (LOR). This paper seeks to answer the question “Are there better clinical outcomes when treating acute Alcohol Withdrawal Syndrome symptoms with chlordiazepoxide or lorazepam?” A literature review was conducted to compile and analyze data from randomized clinical trials (RCTs) and peer-reviewed journal articles. These sources were carefully critiqued and compared to assess support for the use of one benzodiazepine therapy over the other in AWS treatment. The revised Clinical Institute Withdrawal Assessment for Alcohol (CIWA-Ar) scale is the primary measure used to quantify and monitor improvement of AWS under treatment. CIWA-Ar scores, dosing regimens (length, doses, number of doses), days to resolution of symptoms, and adverse effects were compared across multiple studies to reach a conclusion. This paper concludes that lorazepam can be more advantageous than traditionally accepted treatments due to its safety profile in patients with liver disease, its promise in decreasing the time to complete resolution of symptoms, and a potentially easier transition to sobriety.
American Journal of Clinical Medicine Research. 2021, 9(1), 33-35. DOI: 10.12691/ajcmr-9-1-8
Pub. Date: April 26, 2021
The Clinical Value of Neutrophil to Lymphocyte Ratio in the Diagnosis and Treatment of Crohn’s Disease
Original Research
Background: To evaluate the clinical value of the neutrophil-to-lymphocyte ratio (NLR) in the diagnosis and treatment of Crohn's disease (CD). Methods: Between March 2018 and April 2020, patients diagnosed with CD at the First Affiliated Hospital of Nanjing Medical University were identified. A total of 128 patients with a definite diagnosis were enrolled, and 123 healthy people were recruited as the control group over the same period. Data were extracted retrospectively from medical records. White blood cell count (WBC), neutrophil count (NE), lymphocyte count (LY), hypersensitive C-reactive protein (hs-CRP), and erythrocyte sedimentation rate (ESR) were recorded at the time of colonoscopy. IBM SPSS 20.0 software was used for statistical analysis. Results: NLR levels in CD patients with ileocolonic lesions were significantly higher than in CD patients with ileal lesions (3.13 vs 2.72; Z = -2.326, P = 0.02). NLR levels in CD patients with colonic lesions were higher than in those with ileocolonic lesions (4.07 vs 3.13; Z = -2.409, P = 0.04). Levels of NLR, WBC, and hs-CRP in the active stage were significantly higher than in remission (P < 0.05), and ESR levels in the active stage were also higher than in remission. The optimal cutoff for NLR to predict disease was 2.18, with the highest AUC of 0.762 (0.700-0.823, P < 0.001); for WBC it was 5.51 × 10⁹/L, with an AUC of 0.634 (0.565-0.703, P < 0.05); for hs-CRP it was 3.02 mg/L, with an AUC of 0.676 (0.606-0.746, P < 0.05); and for ESR it was 8.5 mm/h, with an AUC of 0.726 (0.660-0.793, P < 0.05). Conclusions: The neutrophil-to-lymphocyte ratio can be used as a sensitive and reliable noninvasive marker in the course of Crohn's disease.
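The ratio and cutoff reported above can be sketched as follows (an illustrative sketch only, not a diagnostic tool; the function names and the example counts are ours, not from the study):

```python
def nlr(neutrophil_count: float, lymphocyte_count: float) -> float:
    """Neutrophil-to-lymphocyte ratio: absolute neutrophil count divided by
    absolute lymphocyte count (both in the same units, e.g. 10^9/L)."""
    if lymphocyte_count <= 0:
        raise ValueError("lymphocyte count must be positive")
    return neutrophil_count / lymphocyte_count

# Optimal cutoff reported in the abstract (AUC 0.762); illustrative only.
NLR_CUTOFF = 2.18

def above_cutoff(neutrophil_count: float, lymphocyte_count: float) -> bool:
    """True if the NLR exceeds the reported cutoff of 2.18."""
    return nlr(neutrophil_count, lymphocyte_count) > NLR_CUTOFF

# Hypothetical example: NE = 4.8 and LY = 1.5 (10^9/L) give NLR = 3.2,
# which is above the reported cutoff.
```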
American Journal of Clinical Medicine Research. 2021, 9(1), 29-32. DOI: 10.12691/ajcmr-9-1-7
Pub. Date: March 30, 2021
Factors Affecting the Main Operation Room Utilization Time at King Abdullah Medical City
Original Research
Background: Hospitals and operating room (OR) departments aim to improve quality and safety as well as utilization and efficiency. Operating rooms are cost-intensive, multi-professional parts of health care organizations, so efficient use of OR capacity is crucial. Aim of the study: To determine the factors affecting main operating room utilization time at King Abdullah Medical City (KAMC). Subjects and Methods: A retrospective descriptive research design was used to achieve the aim of the study. Four hundred forty-eight days were included, with no missing data. Data were collected from the operations registration office and the quality department at KAMC in Saudi Arabia, checked for inventory and reliability, and analyzed with the Statistical Package for the Social Sciences (SPSS) version 24. Results: Post-operative ICU beds were available in 94.2% of cases, surgeons attended on time in 77.9% of cases, and 75.5% of first cases started on time. A highly statistically significant difference was found between 2017 and 2018 in the predictor factors of operating room utilization time. Conclusion: Availability of intensive care unit beds, starting the first case on time, and surgeons' on-time attendance are predictor factors for operating room utilization time. Recommendations: The Ministry of Health should provide enough intensive care beds for the operating theatre. Surgeons should arrive at the operating room on time and start the first case on time, and staff should be motivated by incentives to start operations on time.
American Journal of Clinical Medicine Research. 2021, 9(1), 25-28. DOI: 10.12691/ajcmr-9-1-6
Pub. Date: March 15, 2021
Vitamin D deficiency and Risk Factors in Patients with Crohn’s Disease
Original Research
Background: To explore vitamin D (VD) levels in patients with Crohn's disease (CD) and the correlation between VD levels and season, disease activity, lesion region, and hormone therapy, and to identify risk factors for VD deficiency and the role of VD in the pathogenesis and treatment of Crohn's disease. Methods: Between March 2018 and December 2019, 86 patients diagnosed with CD at the First Affiliated Hospital of Nanjing Medical University were identified, and 86 healthy people were selected as the control group over the same period. VD, white blood cell count (WBC), hemoglobin (Hb), platelet count (PLT), C-reactive protein (CRP), erythrocyte sedimentation rate (ESR), and albumin (ALB) levels were recorded at the time of colonoscopy. Logistic regression was used to analyze the relationship between disease activity, lesion region, hormone therapy, and vitamin D deficiency in patients with Crohn's disease and to identify possible risk factors. Results: VD levels in patients with CD were significantly lower than in healthy controls (35.10 nmol/L vs 67.60 nmol/L; Z = -10.527, P < 0.001). Levels in the summer-autumn group were significantly higher than in the winter-spring group (Z = -2.215, P = 0.027). Patients with ileal lesions had a higher proportion of vitamin D deficiency than patients with non-ileal lesions, and the deficiency rate of patients in the active stage was higher than in remission. With increasing degree of inflammation, vitamin D levels decreased. Logistic regression showed that platelet count > 250 × 10⁹/L, CRP ≥ 8 mg/L, ALB < 30 g/L, and hormone therapy were risk factors for vitamin D deficiency (P < 0.05). Conclusions: Patients with CD have low VD levels, which are related to season. Platelet count > 250 × 10⁹/L, CRP ≥ 8 mg/L, ALB < 30 g/L, and hormone therapy were risk factors for VD deficiency in patients with CD.
American Journal of Clinical Medicine Research. 2021, 9(1), 19-24. DOI: 10.12691/ajcmr-9-1-5
Pub. Date: February 01, 2021
Analysis of Common Pathogenic Bacteria and Drug Resistance of Biliary Tract Infection in Nanjing Area
Background: To analyze the distribution and drug resistance of pathogenic bacteria causing biliary tract infections in Nanjing and provide evidence for the rational clinical use of antibacterial drugs. Methods: Clinical strains isolated in 2019 from bile specimens of patients with suspected biliary infection at the First Affiliated Hospital of Nanjing Medical University were collected. Drug susceptibility was interpreted according to the standards published by the Clinical and Laboratory Standards Institute (CLSI). WHONET 5.6 software was used to analyze pathogen distribution and drug resistance. Results: A total of 693 strains of pathogenic bacteria were isolated, including 448 Gram-negative (64.6%) and 245 Gram-positive (35.4%) strains. The top three pathogens were Escherichia coli (210 strains, 30.3%), Enterococcus faecium (87 strains, 12.6%), and Klebsiella pneumoniae (76 strains, 11.0%). The resistance rates of Escherichia coli to ampicillin, cefuroxime, cefazolin, ceftriaxone, piperacillin, and ampicillin/sulbactam were 80.1%, 69.4%, 67.3%, 64.1%, 63.6%, and 62.8%, respectively. The resistance rates of Klebsiella pneumoniae to ampicillin/sulbactam, cefuroxime, and cefazolin were 65.8%, 64.5%, and 61.1%. The resistance rates of Enterobacter cloacae to ceftriaxone, ceftazidime, and aztreonam were 56.2%, 53.1%, and 53.1%. The resistance rates of Enterococcus faecium to moxifloxacin, clindamycin, erythromycin, penicillin G, ampicillin, ciprofloxacin, and levofloxacin were 100%, 90%, 76%, 72.1%, 64.4%, 64%, and 62%. Conclusions: Pathogens of biliary tract infection are mainly Enterobacteriaceae such as Escherichia coli and Klebsiella pneumoniae, followed by Enterococcus faecium and Enterococcus faecalis. Drug-resistant bacteria were common, so attention should be paid to bile specimen culture and drug sensitivity testing.
American Journal of Clinical Medicine Research. 2021, 9(1), 14-18. DOI: 10.12691/ajcmr-9-1-4
Pub. Date: January 31, 2021
Hericium Erinaceus Polysaccharide Induced Maturation of Murine Bone Marrow-Derived Dendritic Cells
Original Research
Hericium erinaceus polysaccharide (HEP) is derived from a Chinese medicinal herb and has been reported to regulate the immune response effectively in several respects. This study aimed to investigate the influence of HEP on murine dendritic cells (DC). We found that HEP treatment induced DC maturation, as shown by morphology and the up-regulation of co-stimulatory molecules. Endocytic activity was also reduced, another important feature of maturation. HEP also increased allogeneic T cell proliferation in a mixed lymphocyte reaction (MLR) assay. Furthermore, HEP increased the expression level of IL-12. We conclude that HEP induces maturation of bone marrow-derived DC.
American Journal of Clinical Medicine Research. 2021, 9(1), 10-13. DOI: 10.12691/ajcmr-9-1-3
Pub. Date: December 29, 2020
Sonographic Manifestation of Urinary Bilharziasis in School Children in Rahad, Sudan
Original Research
Background: Urinary bilharziasis is endemic in more than 70 countries, mostly in sub-Saharan Africa, including Sudan. It poses a significant burden in terms of morbidity and economic and public health consequences. The disease is usually diagnosed clinically and by urine examination; imaging plays an important role in demonstrating morbid anatomy and complications. Rahad town is located near a small freshwater lake and is known for a high prevalence of the disease. Objective: To investigate the sonographic manifestations of urinary bilharziasis among school children in Rahad town, North Kordofan state, Sudan. Methods: Seventy-five school children who complained of burning micturition, red urine, and/or urgency were included in the study. At least 10 ml of urine was collected from each child in a sterile, tightly closed container. Ultrasound scanning of the abdomen was carried out by a radiologist using a 3.5 MHz abdominal probe and a portable ultrasound scanner. The liver, spleen, kidneys, ureters, and urinary bladder were scanned and documented. The urine was examined by a senior pathologist, who recorded microscopic hematuria and/or bilharzial ova in each sample. Data were analyzed using a statistical package (PSPP) to calculate frequencies and mean values. Results: The study included 75 children in classes 1-4. The mean age was 9.36 years (range 7-14 years); 59 (78.67%) were male. More than a quarter (n = 20, 26.7%) had urine positive for S. haematobium ova. More than half (n = 42, 56%) had positive sonographic findings in the urinary bladder, including over a third (n = 27, 36%) with bladder mucosal polyps. Only one child (1.3%) had a dilated ureter and renal collecting system. Conclusion: Sonographic manifestations of urinary bilharziasis among school children in Rahad are found mainly in the urinary bladder, seen as wall thickening and irregularity, polyp formation and, occasionally, calcification.
Ultrasound could be used for mass screening and follow-up of urinary bilharziasis in children, as it can detect lesions even in patients with a negative urine test for schistosomal ova.
American Journal of Clinical Medicine Research. 2021, 9(1), 6-9. DOI: 10.12691/ajcmr-9-1-2
Pub. Date: December 25, 2020
The Effect of Age on the Seasonal Prevalence of Hyponatremia in Emergency Patients: A Survey of 66,827 Patients from the Emergency Department
Original Research
Background: Hyponatremia is one of the most commonly encountered electrolyte disorders in the emergency department (ED). Seasonal fluctuation in the prevalence of hyponatremia has been reported. We investigated the influence of age on the seasonal prevalence of hyponatremia in an emergency department in China. Methods: A total of 66,827 patients who presented to the ED between January 2015 and December 2017 were reviewed. The adult group (aged 18-59 years) consisted of 36,190 patients, the elderly group (aged 60-79 years) of 22,064 patients, and the very elderly group (aged over 80 years) of 8,573 patients. Information collected included age, sex, serum sodium, and serum creatinine. Hyponatremia was defined as a serum sodium level <135 mEq/L and severe hyponatremia as a serum sodium level <125 mEq/L. Results: The prevalence of hyponatremia was significantly higher in the very elderly group than in the other two groups (30.14%, 22.24%, and 15.33%, respectively). Similarly, the prevalence of severe hyponatremia was significantly higher in the very elderly group than in the other two groups (3.37%, 1.97%, and 0.85%, respectively). The prevalence of hyponatremia and severe hyponatremia was significantly higher in the very elderly group than in the other two groups in all seasons. In the elderly and very elderly groups, there was a significant correlation between high-temperature weather during summer and the prevalence of hyponatremia (r = 0.6094, P = 0.0354; r = 0.6874, P = 0.0135, respectively). Conclusion: Age plays a major role in the seasonal prevalence of hyponatremia and severe hyponatremia. Strategies to prevent hyponatremia and severe hyponatremia should be applied especially to very elderly patients during summer.
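The serum sodium thresholds defined above can be sketched as a simple classifier (an illustrative sketch of the study's definitions only, not clinical software; the function name is ours):

```python
def classify_serum_sodium(sodium_meq_per_l: float) -> str:
    """Classify serum sodium using the study's definitions:
    <125 mEq/L is severe hyponatremia, <135 mEq/L is hyponatremia,
    and 135 mEq/L or above is not hyponatremic."""
    if sodium_meq_per_l < 125:
        return "severe hyponatremia"
    if sodium_meq_per_l < 135:
        return "hyponatremia"
    return "no hyponatremia"
```

Note that severe hyponatremia is checked first, since any value below 125 mEq/L also satisfies the broader <135 mEq/L definition.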
American Journal of Clinical Medicine Research. 2021, 9(1), 1-5. DOI: 10.12691/ajcmr-9-1-1
Pub. Date: November 26, 2020