Abstract
Background: A Health Information System (HIS) integrates data collection, processing, reporting, and use of the information necessary for improving health service effectiveness and efficiency through better management at all levels of health services. Despite the credible use of HIS for evidence-based decision-making, in the vast majority of the world's poorest countries, those with the highest burden of ill health and the greatest need for accurate and timely data have the weakest HIS. Although a Health Management Information System (HMIS) forms the backbone of a strong health system, most developing countries still face a challenge in strengthening routine HIS. The main focus of this study was to assess current HIS performance and identify factors affecting data quality in a resource-limited setting, namely Ethiopian health facilities.
Methods: A cross-sectional study was conducted using structured questionnaires in Dire Dawa Administration health facilities. All unit and/or department heads from all government health facilities were included. The data were analysed using STATA version 11. Frequencies and percentages were computed to present the descriptive findings. Associations between variables were computed using binary logistic regression.
Results: Overall data quality was found to be 75.3% across units and/or departments. Having staff trained to fill in formats, making decisions based on supervisor directives, and department heads seeking feedback were significantly associated with data quality, with magnitudes of (AOR = 2.253, 95% CI [1.082, 4.692]), (AOR = 2.131, 95% CI [1.073, 4.233]) and (AOR = 2.481, 95% CI [1.262, 4.876]), respectively.
Conclusion: Overall data quality was found to be below the national expectation level. Data quality was lower at health posts than at health centres and hospitals. There was also a shortage of assigned HIS personnel, separate HIS offices, and assigned budgets for HIS across all units and/or departments.
Introduction
A Health Management Information System (HMIS) is a system that integrates data collection, processing, reporting, and use of the information necessary for improving health service effectiveness and efficiency through better management at all levels of health services. Maintaining a good HMIS is an essential part of strengthening a health system (Chawla, Bansal & Indrayan 1997; World Health Organization Regional Office for the Western Pacific 1986).
In 2007, the World Health Assembly (WHA) passed a resolution on strengthening of Health Information Systems (HISs). The resolution acknowledges that sound information is critical in framing evidence-based health policy and decision-making. It is also fundamental for monitoring programs towards internationally agreed upon health-related development goals. Although a HMIS forms a backbone for strong health systems, most developing countries still face a challenge in strengthening routine HIS (USAID/Ministry of Health 2006; WHO 2008a).
In a good HMIS, data collection should be matched to the data requirements of users (only relevant data) and to the available processing capabilities; the information generated should be simple to obtain, and only the minimum required information should be collected so that analysis can be done quickly. Feedback to the providers of the health data is an essential component of any reporting system (WHO 2004, 2008b).
The Ethiopian Federal Ministry of Health (FMOH) has emphasised the HMIS as a key component for successful implementation of the Health Sector Development Program (HSDP) strategic plan. The core health indicators come from routine health service and administrative records. HMIS and Monitoring and Evaluation (M&E) are complementary processes that standardise indicator definitions and data recording and reporting forms and that integrate data from different programs into shared channels, improving health system efficiency and effectiveness (GAVI 2002; HMIS Reform Team 2007; TUTAPE 2009).
Research problem
The value of health information is determined by its utilisation in decision-making. Public health decision-making is critically dependent on the timely availability of sound data. Developing countries are reported to have large amounts of unreliable health data, weak human resources, and poor information technology infrastructure; effective HISs are therefore needed to address these problems. In Ethiopia, data quality and utilisation of health information remain weak, particularly at primary health care facilities and district levels (Aqil, Lippeveld & Hozumi 2009; MoH 2010; Sahay 2001; WHO 2007). This research aims to determine whether governmental health facilities currently collect quality data and to identify the independent factors associated with the level of data quality.
Literature review
Description and evaluation framework of Health Information System
HISs have been variously described as the ‘foundation’ for better health. They are composed of inputs, processes, and outputs that are affected by determinants like organisational, technical, and behavioural factors which in turn affect health system performance and consequently lead to better health outcomes (Aqil et al. 2009; Lippeveld, Sauerborn & Bodart 2000).
The Health Metrics Network Framework (HMN) has two parts. These are the normative portion (components and standards) and an implementation portion (a roadmap) (WHO 2008a). The Performance of Routine Information System Management (PRISM) framework consists of tools to assess HIS performance and identify technical, behavioural, and organisational factors that affect HIS; aid in designing priority interventions to improve performance; and improve quality and use of health data (MEASURE Evaluation 2010).
Data quality
As originally proposed, HIS performance is defined as improved data quality and continuous use of information (USAID/Ministry of Health 2006). Data quality is further described in four dimensions: consistency, completeness, timeliness, and accuracy. Completeness is measured not only as filling in all data elements in the facility report form, but also as the proportion of facilities reporting in an administrative area (province or district). Timeliness measures whether the health facility submits its reports to the next level by an accepted deadline. Accuracy is measured by comparing data between facility records and reports, and between facility reports and administrative area databases. Consistency is the degree of similarity of patient data in the register and on patient cards (Lippeveld et al. 2000).
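For illustration only (the formulation below is a commonly used way of quantifying these dimensions and is not necessarily the exact criteria applied in this study), the dimensions can be expressed as simple ratios: report completeness (%) = (number of reports received / number of reports expected) × 100; timeliness (%) = (number of reports received by the deadline / number of reports received) × 100; and accuracy can be summarised as a verification factor = (value recounted from facility registers / value in the submitted report), where a factor close to 1 indicates accurate reporting. With hypothetical figures, a district expecting 40 monthly reports and receiving 36, of which 30 arrived by the deadline, would have 90% completeness and about 83% timeliness.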
Organisational determinants
Governance, planning, availability of resources, training, supervision, finances, information distribution, and promotion of a culture of information are organisational factors that can affect HIS performance (Aqil et al. 2009). Based on the proximity principle, values related to organisational processes that emphasise data quality (use of HIS information, evidence-based decision-making, problem solving, feedback from staff and community, a sense of responsibility, and empowerment and accountability) were chosen to measure the culture of information (Ajzen 2005). HIS management is crucial for HIS performance and is measured through the availability of HIS vision statements and the establishment and maintenance of HIS support services, such as planning, training, supervision, human resources, logistics, and finance; by identifying the level of these support services, it is possible to prioritise actions (Odhiambo-Otieno 2005a).
Technical determinants
Technical determinant factors are related to the specialised know-how and technology needed to develop, manage, and improve HIS processes and performance. These factors include the development of indicators; the design of data collection forms; the preparation of procedural manuals; and the types of information technology and software used for data processing and analysis (Aqil et al. 2009). The PRISM framework assumes that if indicators are irrelevant, data collection forms are complex to fill in, or computer software is not user-friendly, the confidence and motivation of HIS implementers will be affected. Similarly, when software does not process data properly and in a timely manner, the resulting analysis does not provide meaningful conclusions for decision-making, affecting the use of information (Aqil et al. 2009).
Behavioural determinants
Behavioural determinants like HIS users’ demand, confidence, motivation, and competence to perform HIS tasks, affect HIS processes and performance directly. How an individual feels about the utility or outcomes of a task or his confidence in performing that task, as well as the complexity of the task, all affect the likelihood of that task being performed (Hackman & Oldham 1980).
Previous literature
Previous studies showed that data quality was poor in various resource-limited settings, and various technical, behavioural, and organisational factors have been identified. In the early twenty-first century, increasing evidence showed that routine information systems were not producing the intended results. Data quality was found to be poor in settings such as Mozambique and Kenya (Mavimbe, Braa & Bjune 2005; Odhiambo-Otieno 2005b). Similarly, use of information for planning and decision-making was found to be weak in Brazil (Da Silva & Laprega 2005). Many factors contributed to underperforming information systems, such as difficulty in calculating indicators because of poor choices of denominators in the Democratic Republic of Congo (Mapatano & Piripiri 2005). Also observed were errors in HIS reports; inadequacies in computerisation, human, and capital resources; and low management support in Kenya (Odhiambo-Otieno 2005b). In Tanzania, Nsubuga, Eseko and Tadesse (2002) found weaknesses in the areas of standardised case definitions, quality of reporting, analysis, supervision, and feedback.
A study from Uganda showed low information use (24%), consistent with the limited observed skills to interpret (41%) and use (44%) information (Aqil et al. 2008). Another study, on developments of HMIS in Uganda, reported that over the preceding 5 years, timeliness of monthly reporting of outpatient data from the districts to the central level improved markedly from a national average of 21% in 2000, to 63% in 2002, and to 88% in 2004 (Peter, Miriam & Amos 2005). Limited knowledge of the usefulness of HIS data was found to be a major factor in low data quality and information use in Kenya (Odhiambo-Otieno 2005b).
A study on utilisation of HIS in Jimma Zone by Sultan, Challi and Waju (2011) reported that 8 (26.7%), 57 (31.3%), and 54 (36.0%) units and/or departments of health posts, health centres, and district offices, respectively, tried to transform data into information; overall, 32.9% of units and/or departments of health facilities used their data/information for decision-making, planning, budgeting, and M&E of their activities. Poorly coordinated processes and an absence of capacity-building activities were also reported.
Another study, conducted in Bahir Dar by Helen (2011) on assessment of HMIS implementation, reported that there was no incentive for information use or motivation to improve the culture of information. The same study reported that compliance with the rules and regulations of the new HMIS was low, as 47.5% of respondents lacked confidence to participate in and make decisions on HMIS-related activities; 65.7% of respondents lacked appropriate technologies to utilise information; and use of information for decision-making was found to be 45.6%, of whom 35.3% used it for future reference, 42.4% used it to observe trends, and 42.9% used it to pass report data to the health office.
An assessment by Gebrekidan, Negus and Hajira (2012) on data quality and information use showed a limited culture of using information for decision-making in the planning and management of programs; only 37% of the facilities held discussions and made decisions using findings from routine health information. There was also inadequate supervision and feedback from senior levels to address the problems of inadequate documentation and late, incomplete, and inaccurate reporting. These findings indicate the extent to which data quality can be adversely affected by limited investment in infrastructure and human resource capacity, as well as by the performance of the data aggregation and reporting units of the system. The assessment also identified an assigned HMIS focal person at 25 (78%) health facilities, with 7 (28%) of the focal persons having information technology training. In addition, a regular budget allocation for HMIS running costs was found at 7 (22%) of the health facilities.
A study conducted by Gashaw (2006) on utilisation of HMIS showed that overall utilisation of HMIS was 22.5%, and that training, standard data collection methods, data processing, and reporting were the major factors affecting use of information. The HMIS Task Force (2011) reported that although the available information system facilitates the decentralisation process, the information lacks quality and authenticity, and utilising information for decision-making and providing feedback to concerned parties was not yet common practice.
An assessment of the implementation of the new HMIS was conducted by the Ethiopian FMOH and HMN in 2009 in the four pioneer regions, focusing on HMIS resources, data quality, information use, and staff motivation. It found that availability of the HMIS registers, forms, and tools was not up to expectations, and that availability of the display charts recommended by the new HMIS and M&E guidelines was less impressive. There was also evidence of compliance with regular performance reviews and use of data for decision-making. The assessment showed that timeliness of reports varied from 67% to 100% for hospitals, from 86% to 100% for health centres, and from 75% to 86% for health posts, whereas completeness of reports ranged from 93% to 96% in hospitals, from 89% to 96% in health centres, and from 83% to 91% in health posts (Woldemariam et al. 2010).
Generally, HIS performance is assessed in terms of HIS inputs, processes, and outputs, and this performance is measured by the availability of good data quality and continuous use of information, which in turn are affected by technical, behavioural, and organisational factors. The PRISM conceptual framework presented in Figure 1 is based on the concept that HIS assessment is itself one of the HIS inputs: it assesses performance and identifies the technical, behavioural, and organisational factors that affect data quality and information use, which in turn affect the health system of a country. Based on the findings of such an assessment, an HIS strategy is developed and, according to the strategy, interventions are implemented.
FIGURE 1: Performance of Routine Information System Management (PRISM) Conceptual Framework.
Research methodology
This study was conducted in health facilities of the Dire Dawa Administration, Ethiopia, from 1 March to 31 March 2013. Dire Dawa is one of the two chartered cities in Ethiopia (the other being the capital, Addis Ababa). Dire Dawa lies in the eastern part of the country, 501 km from Addis Ababa. The Dire Dawa Administration has 1 governmental hospital, 16 health centres, and 34 health posts. Except for the regional health bureau, it has no zonal or district health bureau.
Based on the 2007 census conducted by the Central Statistical Agency of Ethiopia (CSA), Dire Dawa has a total population of 342 827, of whom 171 930 were men and 170 897 women; 69.92% of the population are considered urban inhabitants. The administration covers an estimated area of 1231.20 km².
Research design
The research used a facility-based cross-sectional study design, that is, study participants were assessed at a single point in time to determine the level of information utilisation and data quality and their associated factors.
Respondents and sampling
The study population was all unit and/or department heads of hospitals, health centres, and health posts. Because all health facilities in the administration currently implement HMIS, all unit and/or department heads from all health facilities were included in the study. In Dire Dawa Administration, there are a total of 267 unit and/or department heads from all health facilities including health posts.
Data collection procedures, instrument and quality management
A face-to-face interview using a structured questionnaire was used to collect primary data from all unit and/or department heads of the health facilities. The questionnaire was adapted from the PRISM framework assessment tool, version 3.1. This tool collects detailed information on the strengths and weaknesses of HIS inputs, processes, and outputs and identifies factors affecting HIS performance. It was prepared in English, translated into Amharic, and then back-translated into English by another person to ensure consistency. Two health professionals who are members of the HIS monitoring team were assigned as supervisors. Six health professionals who had basic HMIS training and prior experience in data collection were assigned as data collectors. To maintain data quality during the data collection period, the two supervisors and the principal investigators supervised the data collection procedures, checked every completed questionnaire, and gave onsite technical assistance to the data collectors.
Data analysis
The collected data were checked for completeness, then coded, entered, and cleaned using STATA version 11. Analysis of the data was done using the same package. Because all the variables were categorical, frequencies and percentages were computed to present the descriptive analysis. Associations between the dependent and independent variables were computed using binary logistic regression. A p-value < 0.05 was considered the cut-off for statistical significance.
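As an illustrative sketch of this analytic approach, the fragment below fits a binary logistic regression and converts the coefficients into adjusted odds ratios with 95% confidence intervals. It is written in Python with statsmodels for illustration only; the actual analysis in this study was carried out in STATA 11, and the file and variable names (his_survey.csv, data_quality, trained_staff, supervisor_directives, seeks_feedback) are hypothetical placeholders, not the study's coding scheme.

import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical file of coded questionnaire responses (one row per unit/department)
df = pd.read_csv("his_survey.csv")

# Binary (0/1) predictors and outcome; names are illustrative only
predictors = ["trained_staff", "supervisor_directives", "seeks_feedback"]
X = sm.add_constant(df[predictors])
y = df["data_quality"]  # 1 = data quality achieved, 0 = not achieved

model = sm.Logit(y, X).fit()

# Exponentiate coefficients and confidence limits to obtain adjusted odds ratios
or_table = np.exp(model.conf_int())
or_table.columns = ["95% CI lower", "95% CI upper"]
or_table["AOR"] = np.exp(model.params)
print(or_table)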
To check whether the fitted model predicted well, the area under the ROC curve was examined, and the Hosmer–Lemeshow test was used to assess overall goodness of fit. Multicollinearity among the variables was checked using the Variance Inflation Factor (VIF). Interactions were also checked during the analysis.
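A minimal sketch of these checks, again in Python rather than STATA and assuming the model, X, and y objects from the previous sketch, might look as follows; the Hosmer–Lemeshow statistic is computed by hand here because it is not built into the libraries used.

import numpy as np
import pandas as pd
from scipy.stats import chi2
from sklearn.metrics import roc_auc_score
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Predicted probabilities from the fitted logistic model
p = pd.Series(model.predict(X), index=y.index)

# Discrimination: area under the ROC curve
print("Area under ROC curve:", roc_auc_score(y, p))

# Hosmer-Lemeshow test: group observations into deciles of predicted risk and
# compare observed with expected events using a chi-square statistic
groups = pd.qcut(p, 10, duplicates="drop")
obs = y.groupby(groups).sum()
exp = p.groupby(groups).sum()
n = y.groupby(groups).size()
hl_stat = (((obs - exp) ** 2) / (exp * (1 - exp / n))).sum()
hl_df = len(obs) - 2
print("Hosmer-Lemeshow chi2 =", hl_stat, ", p =", 1 - chi2.cdf(hl_stat, hl_df))

# Multicollinearity: Variance Inflation Factor for each predictor (constant excluded)
for i, name in enumerate(X.columns):
    if name != "const":
        print(name, "VIF =", variance_inflation_factor(X.values, i))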
Ethical consideration
Institutional ethical clearance was first obtained from Mekelle University, College of Health Science. Data were collected after written consent was obtained from the Dire Dawa Regional Health Bureau. During the interview, each participant was informed about the aim of the study. The interviewer discussed the issue of confidentiality, and participants were informed that they had the full right to refuse or discontinue participation in the research.
Research results
Descriptive analysis
Out of the total of 239 respondents, 188 (78.7%) were from the 16 health centres, 28 (11.7%) were from health posts, and the remaining 23 (9.6%) were from the one referral hospital. Of the departments included in this study, 25 (10.4%) were adult outpatient departments, 12 (5%) each were emergency, delivery, and antiretroviral therapy departments, 15 (6.3%) were tuberculosis and leprosy departments, 10 (4.2%) were voluntary counselling and testing departments, and 15 (6.3%) were under-5 outpatient departments (OPD) (Table 1).
TABLE 1: Distribution of unit and/or department heads of the hospital, health centres and health posts in Dire Dawa Administration health facilities, April 2013.
Health Information System input
The majority of the respondents, 150 (62.7%), reported that there were no assigned HIS personnel, and 154 (64.4%) reported there was no separate HIS office in their department. The majority of department heads, 195 (81.6%), reported there was no specific budget assigned for HIS. Around 125 (52.3%) of the respondents also revealed there was no legislative, regulatory, and planning framework in their facility (Table 2).
TABLE 2: Health facility department's HIS inputs in Dire Dawa Administration, April 2013.
Health Information System process
One hundred and ninety-one (79.9%) unit and/or department heads reported that they collect health data on a daily basis. The majority, 196 (82.0%), of departments also keep patient registers and HIS monthly reports. Among them, 137 (57.3%) revealed that the records were easily accessible to their staff. In addition, 164 (68.6%) heads reported that they had received directives in the last 3 months to check data accuracy, to fill in formats completely, and to submit the monthly report on time. In this study, 185 (77.4%) department heads claimed they submitted HIS reports on time (Table 3).
TABLE 3: Health facility department's HIS process in Dire Dawa Administration, April 2013.
Health Information System output
Compiling of HIS data and reports containing HIS information was reported by 170 (71.1%) and 162 (67.8%) department heads, respectively. Display of key indicators was reported by 145 (60.7%), and quarterly and other feedback reports were available in 138 (57.7%) departments. Regarding the use of health information for decision-making, 156 (65.3%) reported that they use information to make decisions. Among them, 72 (46.2%) use the information for future reference, 66 (42.3%) use it to observe trends in service delivery, and 18 (11.5%) use it to pass reports to other subsidiary health offices (Table 4).
TABLE 4: Health facility department's HIS output in Dire Dawa Administration, April 2013.
Data quality was also determined based on the set criteria and compared by facility type. The analysis revealed that 78.3% of units and/or departments in the referral hospital ensured data quality, compared with 77% in health centres and 64% in health posts. Overall, 75.3% of units and/or departments ensured data quality (Figure 2).
FIGURE 2: Data quality by health facility type in Dire Dawa Administration, April 2013.
Technical determinant characteristics for data quality
Health departments that had a standard set of indicators were 98% more likely to achieve data quality than departments without a standard set of indicators (COR = 1.981, 95% CI [1.035, 3.792]). In addition, units and/or departments that had skilled human resources were 3.26 times more likely to achieve data quality than departments without skilled human resources (COR = 3.260, 95% CI [1.742, 6.103]). Departments that had well-designed reporting formats were 2.38 times more likely to achieve data quality than departments without (COR = 2.383, 95% CI [1.201, 4.728]). Similarly, departments with staff trained to fill in formats were 3.52 times more likely to achieve data quality (COR = 3.521, 95% CI [1.799, 6.893]). Departments that had a user-friendly reporting format were 2.25 times more likely to achieve data quality than departments without (COR = 2.254, 95% CI [1.178, 4.311]). After adjusting for the other variables, only having staff trained to fill in formats remained statistically significant (AOR = 2.253, 95% CI [1.082, 4.692]). Hence, departments with staff trained to fill in formats were 2.25 times more likely to achieve data quality than departments without (Table 5).
TABLE 5: Associated technical factors for data quality in all governmental health facilities in Dire Dawa Administration, April 2013.
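To illustrate how such crude odds ratios are computed (the counts used here are hypothetical and are not taken from Table 5), suppose 60 of 80 departments with trained staff achieved data quality, compared with 40 of 80 departments without. The odds of achieving data quality would be 60/20 = 3.0 in the first group and 40/40 = 1.0 in the second, giving a crude odds ratio of 3.0/1.0 = 3.0; adjusted odds ratios are interpreted in the same way, except that the other covariates in the logistic model are held constant.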
Organisational and behavioural determinant characteristics for data quality
Health departments that base their decisions on supervisor directives were 3.26 times more likely to achieve data quality than departments that did not (COR = 3.260, 95% CI [1.742, 6.103]). Health departments with an established culture of information were 2.28 times more likely to achieve data quality than departments without (COR = 2.282, 95% CI [1.246, 4.178]). Health departments that based their decisions on personal liking were about 50% less likely to achieve data quality than departments that did not (COR = 0.505, 95% CI [0.267, 0.955]), whereas departments that based their decisions on evidence and on considering cost were 3.48 and 2.02 times more likely to achieve data quality than departments that did not (COR = 3.476, 95% CI [1.775, 6.806]) and (COR = 2.020, 95% CI [1.080, 3.777]), respectively. Departments whose managers report on data accuracy regularly were 88% more likely to achieve data quality than departments whose managers did not (COR = 1.885, 95% CI [1.018, 3.489]). Similarly, managers who seek feedback from supervisors were 3.49 times more likely to achieve data quality than managers who did not (COR = 3.495, 95% CI [1.859, 6.571]). However, after adjusting for the other variables, only decisions based on supervisor directives and managers seeking feedback remained determinant factors for data quality. Departments whose decisions were based on supervisor directives were 2.13 times more likely to achieve data quality than departments that did not base their decisions on supervisor directives (AOR = 2.131, 95% CI [1.073, 4.233]). Similarly, managers who sought feedback from senior supervisors were 2.48 times more likely to achieve data quality than managers who did not (AOR = 2.481, 95% CI [1.262, 4.876]) (Table 6).
TABLE 6: Associated organisational and behavioural determinant characteristics for data quality in all governmental health facilities in Dire Dawa Administration, April 2013.
Discussion of findings
Based on the PRISM framework and its HIS performance diagnostic tools, this study assessed the current status of HIS performance in health facilities in terms of HIS inputs, processes, and outputs, and identified possible technical, organisational, and behavioural determinants of HIS data quality.
In this study, 75% of units and/or departments reported that they had trained staff and skilled human resources capable of performing HIS tasks. Only 37% of departments reported that there were specifically assigned personnel for HIS activity. Similarly, 35% of the facilities had separate HIS offices and 19% had assigned budgets for HIS. These findings were somewhat comparable with a similar study in Bahir Dar, where 45%, 43%, and 21% were reported for availability of HIS personnel, HIS offices, and budgets, respectively (Helen 2011); only 23.8% was reported for trained staff in North Gondar (Gashaw 2006). Regarding availability of HIS equipment, 63% had the necessary equipment. However, availability of coordination mechanisms to facilitate the use of HIS resources and the presence of regulatory and planning frameworks for HIS were found to be below 50%. This may be because less attention was given to these issues by the majority of the facilities. Regarding training on HIS activity, 53% reported that training was available. Continuous training on HIS activity is important to create awareness and to develop trained staff and skilled human resources who are confident and motivated to perform HIS tasks. In a similar study conducted in Jimma, Ethiopia, availability of HIS training was below 50% (Sultan et al. 2011).
In order to check the accuracy of data collected and reported at the source, patient registers and copies of HIS monthly reports should be kept. In this study, 90% of departments collected health data on daily activities and 82% kept patient registers and HIS monthly reports. These records were also easily accessible to staff and easily retrieved in 67% of the departments. A similar study done in Jimma, Ethiopia, reported that all health departments collected data on daily activities and 73% kept their registers and monthly reports (Sultan et al. 2011), whereas the study conducted in Bahir Dar, Ethiopia, revealed that only 77% collected data on daily activities. In this study, more than 74% of departments had clear procedures for distributing and reporting the collected data, 85% passed the data to the administrative level, and 75% used a set of criteria to verify completeness and consistency of data before reporting. Regarding availability of supervision, 69% of units and/or departments had received supervisor directives to check data accuracy, to fill in formats completely, and to submit monthly reports on time. This was higher when compared with similar studies, in which availability of supervision was reported to be below 50% in Bahir Dar and North Gondar (Helen 2011; Gashaw 2006). This may be because the majority of the health facilities were easily reachable for supervision.
Accurate, consistent, complete, and timely information is essential for public health decision-making and action, such as policymaking, planning, programming, and monitoring. In this study, 77.4% of department heads indicated that reports were submitted according to the schedule, which is between the 20th and 22nd of the month for health posts and the 20th to 24th for health centres and hospitals. Eighty-two per cent of department heads indicated that reports were completely filled in before submission, and 78.7% of these reports were agreed to be consistent. A similar study in North Gondar showed that only 50% of HMIS reports were submitted on time and 96% of those reports were completely filled in (Gashaw 2006). Consistency of reports in this study was slightly higher than in the Jimma study, where 62% of respondents claimed consistency of reports (Sultan et al. 2011). This improvement may be because the majority of units and/or departments had staff with basic HIS training and thus skilled human resources to perform HIS tasks, improving data quality and information use. Another reason could be the availability of good supervision and feedback from senior supervisors.
About 61% of the units and/or departments reported that there were routine meetings for reviewing managerial and administrative matters. This was higher than in the assessment of data quality and information use in selected health facilities (Helen 2011), in which only 23.5% of facilities had routine review meetings. In this study, availability of incentives and of policies for information use was found to be below 40%. A similar finding was reported in the study conducted in Bahir Dar, where only 18.3% and 42.9%, respectively, were reported for availability of incentives and policies (Helen 2011).
Although the PRISM framework allows the identification of determinant factors for HIS utilisation and data quality, the lack of similar studies conducted using this framework did not allow comparison of the identified determinant factors across studies.
Regarding technical factors, having staff trained to fill in formats was found to be a predictor of data quality. Among the possible organisational and behavioural determinants, decisions based on supervisor directives and managers seeking feedback were found to be determinant factors for data quality.
Conclusion and recommendations
The level of data quality was below the national standard of 80% in all health facility types. However, hospitals and health centres performed better than health posts. The factors affecting data quality were lack of training, lack of decisions based on supervision, and lack of feedback. To improve data quality at the health facility level, managers should supervise and give feedback on time. In addition, continuous training should be provided to health care providers.
Acknowledgements
We are grateful to the research team who dedicated their full time and effort during data collection. We would like to thank the Dire Dawa Administration Health Bureau for its cooperation in undertaking this research, and also all department heads of the hospital, health centres, and health posts for their participation and support in providing the required information for this research.
Competing interests
The authors declare that they have no financial or personal relationships which may have inappropriately influenced them in writing this article.
Authors’ contributions
K.T. (Dire Dawa City Administration Health Bureau), K.T. (Mekelle University), G.M. and W.T. conceived and designed the study. K.T. (Dire Dawa City Administration Health Bureau), K.T. (Mekelle University), G.M. and W.T. performed the study and analysed the data.
References
Ajzen, I., 2005, 'Laws of human behavior: Symmetry, compatibility, and attitude-behavior correspondence', in A. Beauducel, B. Biehl, M. Bosniak, W. Conrad, G. Schönberger & D. Wagener (eds.), Multivariate research strategies, pp. 3–19, Shaker Verlag, Germany.
Aqil, A., Hotchkiss, D., Lippeveld, T., Mukooyo, E. & Asiimwe, S., 2008, Do the PRISM framework tools produce consistent and valid results? A Uganda study. Working Paper. National Information Resource Center, Ministry of Health, MEASURE Evaluation, Uganda, 14 March 2008.
Aqil, A., Lippeveld, T. & Hozumi, D., 2009, 'PRISM framework: A paradigm shift for designing, strengthening and evaluating routine health information systems', Health Policy and Planning 24(3), 217–228.
Chawla, R., Bansal, A.K. & Indrayan, A., 1997, ‘Informatics technology in health care’, Nati Med 10(1), 31–50.
Da Silva, A.S. & Laprega, M.R., 2005, 'Critical evaluation of the primary care information system (SIAB) and its implementation in Ribeirão Preto, São Paulo, Brazil', Cadernos de Saúde Pública 21, 1821–1828.
Gashaw, A., 2006, ‘Assessment of utilization of Health Information System at district level with particular emphasis to HIV/AIDS program in North Gonder’, Master’s thesis, Department of Community Health Medical Faculty, Addis Ababa University.
GAVI, 2002, Monitoring national immunization system using core indicators, WHO, Geneva.
Gebrekidan, M., Negus, W. & Hajira, M., 2012, 'Data quality and information use: A systematic review to improve evidence in Ethiopia', African Health Monitor, March (14).
Hackman, J.R. & Oldham, G.R., 1980, Work redesign, Addison-Wesley, Reading, MA.
Helen, T., 2011, ‘Assessment of the Health Management Information System implementation status in public health facilities in Bahir-Dar city’, Master’s thesis, School of Information Science, Addis Ababa University.
HMIS Reform Team, 2007, Health management information system/M&E: Information use guidelines and display tools, Federal Ministry of Health, Addis Ababa.
HMIS Task Force, 2011, SNNP Regional Government, MoH, Addis Ababa.
Lippeveld, T., Sauerborn, R. & Bodart, C., 2000, Design and implementation of health information systems, WHO, Geneva.
Mapatano, M.A. & Piripiri, L., 2005, 'Some common errors in health information system report (DR Congo)', Santé Publique 17, 551–558. http://dx.doi.org/10.3917/spub.054.0551
Mavimbe, J.C., Braa, J. & Bjune, G., 2005, 'Assessing immunization data quality from routine reports in Mozambique', BMC Public Health 5, 108. http://dx.doi.org/10.1186/1471-2458-5-108
MEASURE Evaluation, 2010, Performance of Routine Information System Management Framework, 3.1 edn., USAID, Washington, DC.
MoH, 2010, Health Facility’s Revised Health Management Information System (HMIS) procedural manual, MoH, Kampala.
Nsubuga, P., Eseko, N. & Tadesse, W., 2002, ‘Structure and performance of infectious disease surveillance and response, United Republic of Tanzania, 1998’, Bulletin of the World Health Organization 80, 196–203.
Odhiambo-Otieno, G.W., 2005a, ‘Evaluation criteria for district health management information systems: Lessons from the Ministry of Health, Kenya’, International Journal of Medical Informatics 74, 31–38. http://dx.doi.org/10.1016/j.ijmedinf.2004.09.003
Odhiambo-Otieno, G.W., 2005b, ‘Evaluation of existing district health management information systems: A case study of the district health systems in Kenya’, International Journal of Medical Informatics 74, 733–744. http://dx.doi.org/10.1016/j.ijmedinf.2005.05.007
Peter, K., Miriam, N. & Amos, N., 2005, ‘Development of HMIS in poor countries: Uganda as case study’, Health Policy and Development Journal 3(1), 48–50.
Sahay, S., 2001, ‘Special issues on IT and health care in developing countries’, Electronic Journal of Information Systems in Developing Countries 5(1), 1–6.
Sultan, A., Challi, J. & Waju, B., 2011, ‘Utilization of Health Information System at district level in Jimma zone’, Ethiopian Journal of Health Science 21, 75–79.
TUTAPE, 2009, Tulane University supported HMIS implementation program training manual, TUTAPE, Addis Ababa.
USAID/Ministry of Health, 2006, Rwanda Health Information System Assessment Report, RTI International, Rwanda.
WHO, 2004, Developing Health Management Information System a practical guide for developing countries, WHO, Geneva.
WHO, 2007, Assessment of Ethiopian National Health Information System, WHO, Geneva.
WHO, 2008a, A Framework and Standards for Country Health Information System Development, WHO, Geneva.
WHO, 2008b, Assessing the National Health Information System: An Assessment Tool, version 4, WHO, Geneva.
WHO, 2008c, Health Metrics Network, viewed 08 December 2012, from http://www.who.int/entity/healthmetrics/hmn
Woldemariam, H., Habtamu, T., Fekadu, N. & Habtamu, A., 2010, ‘Implementation of an integrated Health Management Information System and Monitoring and Evaluation (HMIS/M&E) system in Ethiopia: Progress and lessons from pioneering regions’, Quarterly health Bulletin 3(1), 48–52.
World Health Organization Regional Office for the Western Pacific, 1986, Workshops on the assessment and development of national Health Information Systems (HIS) and epidemiological surveillance, WHO, Geneva.