Objective To understand the status of mold contamination in spices sold in Hunan Province, so as to provide scientific evidence for formulating contamination control measures. Methods According to the National Food Safety Standard-Microbiological Examination of Food-Counting of Mold and Yeast (GB 4789.15-2016) and the National Manual for Monitoring the Risk of Food Contamination and Harmful Factors, mold counting was performed on 434 commercially available spices collected in Hunan Province from 2022 to 2023, and species identification of the isolated molds was carried out using morphology and matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). Results The detection rate of mold in market spice samples was 73.3% (318/434); fennel and five-spice powder had the highest detection rates (both 90.0%, 10/90), and star anise had the lowest (46.9%, 23/49). The average level of mold contamination was 4.2×10⁴ CFU/g. There were no statistically significant differences in detection rates among spice samples of different packaging types or sampling seasons (χ2=1.99 and 2.67, respectively; both P>0.05), but the difference in detection rates among spice types was statistically significant (χ2=31.79, P<0.05). The spice samples were mainly contaminated by Aspergillus, Penicillium, Rhizopus, Lichtheimia, Mucor and Fusarium.
Among them, fragrant leaves (66.7%, 12/18), chili peppers (61.4%, 51/83), Sichuan peppercorns (54.8%, 23/42), and fennel (40.0%, 8/20) were most severely contaminated by Aspergillus; cumin was mainly contaminated by Aspergillus (56.5%, 35/62) and Rhizopus (17.7%, 11/62); peppers, star anise, cinnamon, grass fruit, and other spices were mainly contaminated by Aspergillus, which accounted for 65.4% (53/81), 43.5% (10/23), 35.9% (14/39), 33.3% (2/6), and 55.0% (22/40), respectively, and by Penicillium, which accounted for 9.9% (8/81), 30.4% (7/23), 35.9% (14/39), 33.3% (2/6) and 12.5% (5/40), respectively. Five-spice powder was mainly contaminated by Aspergillus (30.0%, 3/10), Rhizopus (20.0%, 2/10), and Lichtheimia (20.0%, 2/10). Conclusions Mold contamination is present in spices sold on the market in Hunan Province. It is recommended to strengthen the monitoring and management of spices, so as to provide a scientific basis for risk assessment and the development of relevant standards.
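Several abstracts in this issue compare rates across groups with Pearson's χ2 test of independence (for example, χ2=31.79 across spice types above). As a minimal illustrative sketch of the statistic itself, using toy counts rather than any study's data:

```python
# Hypothetical sketch: Pearson's chi-square statistic for an r x c
# contingency table, as used to compare detection rates across groups.
# The counts below are illustrative, not the published data.

def chi_square(table):
    """Return the Pearson chi-square statistic for an r x c table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            stat += (observed - expected) ** 2 / expected
    return stat

# Rows: three spice types; columns: (mold detected, not detected).
table = [[9, 1], [23, 26], [51, 32]]
print(round(chi_square(table), 2))
```

The statistic is then compared against the χ2 distribution with (r-1)(c-1) degrees of freedom to obtain a P value; in practice `scipy.stats.chi2_contingency` does both steps.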
Objective To understand the epidemiological characteristics of a varicella outbreak and vaccine protection efficiency in a school in Bao'an District, Shenzhen City, so as to provide evidence for formulating effective varicella prevention and control measures. Methods Descriptive epidemiological methods were used to analyze the distribution characteristics (time, place and person) of the outbreak and differences in attack rates. A retrospective cohort study was conducted to analyze the protection efficiency of the vaccine by number of doses and time since vaccination. Results From March 6 to April 30, 2025, a total of 21 varicella cases occurred in the primary school department of a school in Bao'an District, including 20 students and 1 staff member. Cases were not isolated, and attending classes and participating in group activities during illness, together with household transmission, were the main causes of the outbreak. The outbreak involved 4 classes, with attack rates of 30.4% (14/46) in Class 5(1), 6.4% (3/47) in Class 5(4), 4.1% (2/49) in Class 2(5), and 2.0% (1/50) in Class 1(2); the differences were statistically significant (χ2=26.476, P<0.05). The overall protection efficiency of the varicella vaccine was 85.40%, with 91.70% for two doses and 71.90% for one dose; the difference between doses was statistically significant (χ2=29.428, P<0.05). The protection efficiencies for vaccination durations of <5 years, 5-10 years, and >10 years were 95.60%, 78.70%, and 53.80%, respectively. The protection efficiencies of the <5 years and 5-10 years groups differed significantly from the unvaccinated group (χ2 values were 34.986 and 13.025, respectively; both P<0.05).
Conclusions Failure to isolate cases in a timely manner, attending classes and participating in group activities during illness, and bidirectional household-school transmission can easily lead to the spread of varicella outbreaks. The protection efficiency of two-dose varicella vaccine is higher than that of one dose, and emergency vaccination is the best way to protect susceptible populations.
Objective To delineate the distribution of radiological resources and the frequency of medical radiation procedures in Luoyang City, Henan Province, thereby providing evidence for radiological health governance. Methods A survey of the basic situation and frequency of radiation diagnosis and treatment in 2022 was conducted in all facilities in Luoyang City. Using the questionnaire from the Luoyang Radiation Health Monitoring Project Work Manual, the level of radiation diagnosis and treatment institutions, the numbers of on-duty employees, radiation workers, and radiation diagnosis and treatment devices, and the frequency of application of radiation diagnosis and treatment were investigated through on-site questionnaires, on-site information retrieval, and on-site verification. Results In 2022, Luoyang had 328 radiation diagnosis and treatment institutions, 810 radiation diagnosis and treatment devices, and 2 351 radiological workers, yielding 114 devices and 332 workers per million inhabitants. Primary-level or lower facilities constituted the largest group (82.0%, 269/328) and possessed the greatest share of equipment (44.2%, 358/810), but only 24.8% (582/2 351) of the workforce. The annual frequencies per 1 000 population were 531.1 for medical X-ray diagnosis, 4.9 for interventional radiology, 0.8 for radiotherapy, 1.4 for nuclear medicine diagnosis, and 0.1 for nuclear medicine treatment. Conventional radiography accounted for 64.9% of all examinations, whereas CT comprised 31.4%; however, CT contributed the largest collective effective dose (7 654.19 person·Sv). Conclusions Luoyang City's radiation diagnosis and treatment resources and utilization frequency exceed the level-II medical care level but remain below level-I standards and the metrics of economically developed southern Chinese cities.
The shortage of radiological staff in primary-level institutions and the rapid growth of CT diagnosis and treatment warrant optimized resource allocation and enhanced radiological protection measures focused on CT.
Objective To explore the clinical value of combined detection of urinary retinol-binding protein (URBP), urinary N-acetyl-β-D-glucosaminidase (UNAG), urinary β2-microglobulin (Uβ2-MG) and urinary creatinine (UCr) in the early diagnosis of renal injury from lead poisoning. Methods A total of 86 patients with occupational lead poisoning admitted to the Third People's Hospital of Henan Province (Henan Hospital for Occupational Disease) from November 1, 2021 to October 31, 2025 were selected as the poisoning group, and 79 patients with a history of occupational lead exposure and urinary lead levels of 25-69 μg/L during the same period were selected as the normal group. In addition, 86 healthy people with no history of occupational exposure to harmful chemicals and normal urinary lead levels were selected as the control group. The levels of urinary lead, serum cystatin C (Cys-C), serum creatinine (Cr), uric acid (UA), URBP, UNAG, Uβ2-MG and UCr were detected and compared among the three groups. The correlations of URBP, UNAG, Uβ2-MG and UCr with urinary lead were analyzed. The diagnostic efficacy of the four indicators, alone and in combination, for detecting early renal injury in patients with occupational lead poisoning was evaluated using the receiver operating characteristic (ROC) curve. Results There were no significant differences in the levels of Cys-C, BUN, Cr, UA and urinary lead among the poisoning, normal and control groups (F=0.154, 0.555, 2.044 and 0.987, respectively; all P>0.05). The levels of URBP, UNAG, Uβ2-MG and UCr were higher in the poisoning group than in the normal group, and higher in the normal group than in the control group, and the differences were all statistically significant (all P<0.05). The levels of URBP, UNAG, Uβ2-MG and UCr were positively correlated with urinary lead (r=0.522, 0.472, 0.655 and 0.876, respectively; all P<0.05).
ROC analysis of URBP, UNAG, Uβ2-MG and UCr, alone and in combination, for the diagnosis of early renal injury induced by occupational lead poisoning showed that the area under the curve for combined diagnosis was 0.967, with a sensitivity of 96.52% and a specificity of 93.01%, all higher than those of any single indicator. Combined diagnosis significantly improved the sensitivity and specificity of detecting early renal injury in lead poisoning (P<0.05). Conclusions The combined detection of URBP, UNAG, Uβ2-MG and UCr can serve as an important indicator of early renal injury in the occupational lead poisoning population, provides a new direction for clinical diagnosis and occupational disease prevention and treatment, and is worthy of promotion.
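The area under the ROC curve reported above can be computed with the Mann-Whitney formulation: the probability that a randomly chosen positive case scores higher than a randomly chosen negative one. A toy sketch (illustrative scores, not actual URBP/UNAG/Uβ2-MG/UCr measurements) showing how a combined score can outperform each marker alone:

```python
# Hypothetical sketch: AUC via the Mann-Whitney formulation, comparing
# two imperfect single markers against their simple combination.
# All values below are toy data for illustration only.

def auc(scores, labels):
    """AUC = P(random positive scores above random negative); ties count 0.5."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels  = [1, 1, 1, 1, 0, 0, 0, 0]   # 1 = injured, 0 = healthy (toy)
marker1 = [5, 4, 2, 6, 3, 1, 5, 2]
marker2 = [3, 4, 6, 1, 2, 3, 1, 2]
combined = [a + b for a, b in zip(marker1, marker2)]  # naive combined score

print(auc(marker1, labels))   # single marker
print(auc(marker2, labels))   # single marker
print(auc(combined, labels))  # combined score separates better here
```

In practice the combined score would come from a fitted model (for example logistic regression over the four markers) rather than a raw sum; the AUC computation is the same.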
Objective To investigate and analyze the etiology and epidemiological characteristics of a cutaneous anthrax cluster in Yanshi District, Luoyang City, Henan Province, and to propose targeted prevention and control recommendations, so as to provide scientific evidence for anthrax prevention and control. Methods Epidemiological investigations were conducted on suspected anthrax cases and at-risk individuals in Yanshi District, Luoyang City, in September 2024. Statistical analysis was performed on case data to describe the distribution characteristics of cases and to analyze exposure risk factors. Samples from cases, the environment, and infected livestock were collected for laboratory testing. Results A total of seven cutaneous anthrax cases were reported in this cluster, including four confirmed cases and three clinically diagnosed cases. All cases were associated with dead cattle on a livestock farm. Risk factor analysis revealed that processing contaminated beef from dead cattle (OR=17.11, 95%CI: 2.01-145.82, P<0.05) and the presence of skin lesions (Fisher's exact test, P<0.05) were significant risk factors for this cluster. Bacillus anthracis was detected by polymerase chain reaction (PCR) in samples collected from cases, the environment and diseased livestock. Additionally, five isolates of Bacillus anthracis were obtained; all were identified as the canonical single nucleotide polymorphism (canSNP) genotype A.Br.001/002 and the 15-locus multiple-locus variable-number tandem repeat analysis (MLVA-15) genotype MLVA-CHN2. Conclusions The seven cases in this cluster were infected through slaughtering or consuming dead cattle infected with Bacillus anthracis. Genomic tracing demonstrated that the outbreak strain belonged to the predominant canSNP genotype within China and was identical to historically prevalent strains in Henan Province.
Hepatocellular carcinoma (HCC) is a prevalent and highly lethal liver malignancy. The tumor microenvironment (TME) of HCC, characterized by local immunosuppression, functions as a dynamic "ecosystem" that supports tumor progression. Composed of diverse parenchymal and stromal cells, the TME critically regulates tumor cell proliferation, invasion, angiogenesis, and therapy resistance. Macrophages, as core components of the innate immune system, demonstrate remarkable heterogeneity and plasticity. Tumor-associated macrophages (TAM) represent a crucial immune cell subpopulation within the tumor niche, directly or indirectly facilitating tumor cell proliferation and survival, neovascularization, and immunosuppression. This review systematically summarizes the distinct roles and underlying mechanisms of TAM subtypes within the HCC immune microenvironment, aiming to provide new perspectives for developing immunotherapeutic strategies against HCC.
Objective To investigate risk factors for postoperative infection in patients undergoing radical gastrectomy for gastric cancer, providing evidence for optimizing perioperative prevention strategies. Methods A retrospective cohort study was conducted on 1 056 gastric cancer patients who underwent radical gastrectomy at a tertiary Grade A hospital in Gansu Province between 2021 and 2023. Patients were stratified into infection (n=105) and non-infection (n=951) groups. Potential risk factors were analyzed using univariate and multivariate logistic regression models. Results Multivariate analysis identified the following independent risk factors for postoperative infection: older age (OR=1.028, 95%CI: 1.001-1.056), body mass index (BMI) ≥28 (OR=2.319, 95%CI: 1.044-5.152), diabetes mellitus (OR=2.293, 95%CI: 1.047-5.023), tumor location (relative to the cardia: gastric antrum OR=2.391, 95%CI: 1.152-4.962), intraoperative blood loss ≥100 mL (OR=1.874, 95%CI: 1.179-2.978), days of mechanical ventilation (OR=1.163, 95%CI: 1.031-1.313), and days of central venous catheterization (OR=1.064, 95%CI: 1.031-1.097). Laparoscopic surgery was a significant protective factor compared with open surgery (OR=0.107, 95%CI: 0.053-0.215). Distal gastrectomy (DG) was associated with a lower infection risk than total gastrectomy (TG) (OR=0.438, 95%CI: 0.254-0.754). Conclusions Postoperative infection risk is multifactorial, associated with patient comorbidities (age, obesity, diabetes), tumor location, surgical factors (approach, extent, blood loss), and invasive procedures. These findings underscore the need for enhanced perioperative management targeting modifiable risks to reduce postoperative infection.
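Odds ratios with 95% confidence intervals of the kind reported above come from logistic regression models; for a single binary exposure the OR reduces to the familiar 2×2 cross-product, with a Woolf (log-normal) interval. A minimal sketch with hypothetical counts, not the study's data:

```python
# Hypothetical sketch: odds ratio and Woolf 95% CI from a 2x2 table.
# Rows: exposed / unexposed; columns: infected / not infected.
# The counts are invented for illustration only.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR = (a*d)/(b*c); CI from the log-OR standard error sqrt(sum of 1/cell)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Toy counts: 20/80 infected among exposed, 10/160 among unexposed.
print(odds_ratio_ci(20, 80, 10, 160))
```

Multivariate ORs additionally adjust each estimate for the other covariates in the model, which a 2×2 table cannot do; the interpretation of the interval is the same.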
Objective To analyze the epidemiological characteristics of adverse events following immunization (AEFI) in Luoyang City, Henan Province, from 2018 to 2024 and evaluate vaccine safety, so as to provide references for enhancing AEFI surveillance. Methods AEFI case data for Luoyang City from 2018 to 2024 were extracted from the National Healthcare Security Information System, and vaccination data were collected from the Henan Provincial Immunization Program Information Management System. Descriptive epidemiological analysis was conducted on the collected data. Results From 2018 to 2024, a total of 22 598 AEFI cases were reported in Luoyang City, with a reported incidence of 63.64/10⁵ vaccine doses administered. General reactions accounted for 98.76% (22 317 cases), while abnormal reactions accounted for 0.69% (157 cases). The male-to-female ratio was 1.06∶1. Children aged 0-<2 years accounted for 61.25%. The composition of severe and non-severe AEFI cases differed across age groups (χ2=197.88, P<0.05). Most AEFI cases (89.41%, 23 888/26 716) occurred within the first day post-vaccination. The reported incidence of abnormal reactions was 0.44/10⁵ doses. Among these, allergic reactions were the most common (56.69%, 89/157), with a reported incidence of 0.25/10⁵ doses. Of the AEFI cases, 95.26% (21 527/22 598) had a final outcome of cure. Conclusions The safety profile of vaccines administered in Luoyang City from 2018 to 2024 was satisfactory. However, the sensitivity of the AEFI surveillance system could be further improved to ensure more comprehensive safety monitoring.
Objective To dynamically track changes in iodine nutrition levels in key populations before and after water improvement to reduce iodine in the high-iodine counties along the Yellow River in Puyang City, Henan Province, and to evaluate the implementation effect of prevention and control measures, so as to provide data support for scientifically adjusting intervention strategies. Methods In the three counties along the Yellow River in Puyang City (Puyang County, Fan County and Taiqian County), data from water-source high-iodine monitoring in 2021 (before water improvement) and iodine deficiency disorder monitoring from 2022 to 2024 (one, two and three years after water improvement) were collected from the China Disease Control and Prevention Information System. The iodine contents of drinking water supplied to residents in the survey villages, of household edible salt, and of urine from children aged 8-10 years and pregnant women were measured, and goiter prevalence in children was assessed; edible iodized salt coverage, iodine nutritional status in children and pregnant women, and goiter prevalence in children were compared between before water improvement and one, two and three years afterwards. Results The median iodine levels of drinking water for residents in the three counties before water improvement and one, two and three years after water improvement were 173.50, 1.22, 0.81 and 0.80 μg/L, respectively. The edible iodized salt coverage rates in children were 27.53% (147/534), 33.22% (201/605), 57.50% (345/600) and 76.97% (468/608), respectively; those for pregnant women were 1.53% (2/131), 25.91% (64/247), 60.67% (182/300) and 84.88% (320/377), respectively; the differences across years were statistically significant (χ2 values were 366.202 and 373.563, respectively; both P<0.05).
Before water improvement and one, two and three years after water improvement, the median urinary iodine levels in children were 333.8, 259.4, 198.0, and 203.1 μg/L, respectively, with statistically significant differences across the groups (H=212.189, P<0.05); the median urinary iodine levels in pregnant women were 325.5, 214.1, 126.7, and 172.2 μg/L, respectively, also with statistically significant differences across the groups (H=181.710, P<0.05). Over the same four time points, the goiter rates in children were 7.12% (38/534), 4.71% (20/425), 2.00% (4/200), and 0.99% (6/608), respectively. The goiter rates in children two and three years after water improvement were lower than before water improvement (χ2 values were 7.060 and 28.832, respectively; both P<0.05). Thyroid volumes in the 8-, 9- and 10-year-old groups three years after water improvement were smaller than those one year after, two years after, and before water improvement (8-year-old group: t=52.853, 40.108 and 25.485, respectively; 9-year-old group: t=100.490, 51.339 and 34.273, respectively; 10-year-old group: t=50.672, 40.518 and 17.209, respectively; all P<0.05). Conclusions After water improvement and iodine reduction in the high-iodine counties along the Yellow River in Puyang City, the iodine nutrition of children and pregnant women is appropriate, and the goiter rate in children has fallen to normal levels. Improving water sources to reduce iodine is an effective measure to prevent and control the hazards of high water iodine. However, pregnant women are at risk of iodine deficiency, and further efforts should be made to strengthen the supply of edible iodized salt and health education.
Objective To understand the survival and mortality of HIV/AIDS cases in Zhangzhou City, Fujian Province, and to analyze factors affecting case mortality, so as to provide evidence for scientifically adopting targeted intervention measures. Methods A retrospective cohort study was conducted, collecting relevant information on HIV/AIDS cases in Zhangzhou City from 1996 to 2024. The life table method was used to describe the survival probability of cases, the Kaplan-Meier method was used to estimate average survival time, the Log-rank test was used to compare differences between groups, and a Cox proportional hazards regression model was used to analyze factors influencing survival time. Results A total of 2 213 HIV/AIDS cases were included, among whom 630 deaths from all causes occurred, with an average survival time of 16.323 years (95%CI: 15.930-16.716). The survival probabilities for the first, fifth, tenth, fifteenth and twentieth years were 81.03%, 70.64%, 64.44%, 61.14%, and 53.50%, respectively. The cases were divided into three periods by reporting time (1996-2005, 2006-2015 and 2016-2024), with average survival times of 4.383, 13.533, and 17.447 years, respectively (Log-rank χ2=126.309, P<0.05). Being in the AIDS stage, not undergoing baseline CD4+ T lymphocyte count (CD4) testing, and not receiving antiretroviral therapy were all risk factors for mortality (P<0.05). The mortality risk in the AIDS stage was 2.207 times that in the HIV stage (95%CI: 1.718-2.836), and the mortality risk for cases who had not received antiretroviral therapy was 17.976 times that for those who had (95%CI: 13.949-23.164). The mortality risk for cases who did not undergo baseline CD4 testing was 2.046 times that of the CD4 0-<200 cells/μL group (95%CI: 1.489-2.810).
Homosexual transmission, detection through non-medical institutions, and education at college level or above were protective factors against mortality (all P<0.05). The mortality risk for homosexual transmission was 0.563 times that for heterosexual transmission (95%CI: 0.378-0.840), and the mortality risk for cases detected through non-medical institutions was 0.556 times that for cases detected through medical institutions (95%CI: 0.454-0.682). The mortality risk for cases with college education or above was 0.482 times that for illiterate cases or those with only primary school education (95%CI: 0.260-0.895). Conclusions The survival time of HIV/AIDS cases is influenced by multiple factors. Early detection and antiretroviral treatment can effectively reduce mortality risk. Early testing, early detection, and early treatment are effective measures to reduce HIV/AIDS mortality.
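Survival probabilities of the kind reported above are estimated with the Kaplan-Meier product-limit method, which steps the survival curve down at each death time while dropping censored cases from the risk set. A minimal sketch on a toy cohort (not the Zhangzhou data):

```python
# Hypothetical sketch: Kaplan-Meier product-limit estimator.
# events[i] is 1 for a death and 0 for a censored observation.
# The six (time, event) pairs below are toy data for illustration only.

def kaplan_meier(times, events):
    """Return [(event time, survival probability)] for each death time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        n_with_t = sum(1 for tt, _ in data if tt == t)
        if deaths:
            surv *= 1 - deaths / n_at_risk      # step down at a death time
            curve.append((t, surv))
        n_at_risk -= n_with_t                   # drop deaths and censored
        i += n_with_t
    return curve

times  = [1, 2, 2, 3, 4, 5]   # follow-up times (years)
events = [1, 1, 0, 1, 0, 1]   # 0 marks censoring
for t, s in kaplan_meier(times, events):
    print(t, round(s, 3))
```

Production analyses would use a tested library (for example `lifelines.KaplanMeierFitter`), which also supplies confidence bands and median survival; the stepwise product above is the core of the estimator.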