It is almost hard to believe that our GIEP internship experience is coming to an end in one week. After spending time working with iKure over the past month and a half—and over three months developing our project with them from Ann Arbor—we have definitely come to know both the employees at the organization and the ins and outs of their operations, both positive and not-so-positive.
Over the last month, our team has pushed relentlessly to have iKure staff arrange field visits to rural communities to get information and feedback from health workers on our application. We only had three 4-hour visits in this time period and interacted with two iKure Community Health Workers and four ASHAs. That was essentially all we were allowed for the formative evaluation process that needed to be done to carry our project forward.
However, we learned vastly more information from these visits than we had in the last four months, in which we tried to gather knowledge about health workers from the iKure project manager and operations manager, who only go out into these communities every once in a while.
We no longer have the time to conduct another field visit to help inform the design of our mobile application. However, what we definitely will finish by the time we leave Kolkata is a detailed documentation “packet” that contains evaluation reports on the field testing we did, an application style guide to standardize iKure’s existing technology frameworks, and a recommendations manual to help iKure move forward sustainably and appropriately in the mHealth sphere.
As we continue to write these documents, we are not only considering the health workers’ contexts that could shape future trainings and usage of the KOL-Health Application (and potentially other electronic health tools), but also the aspects of iKure’s system that can be improved to better support successful implementation and evaluation of technology in community health work.
Our project over the course of the last five months has stayed the same at its core, even as its shape has changed:
We are still looking at maternal-child health and how to integrate an MCH program of sorts into iKure’s framework. However, rather than helping guide health worker training in MCH, we are creating public health surveys and embedding them into an application that will hopefully improve health workers’ activities in the field—especially for visits to mothers, newborns and children.
We initially thought back in January that a design probe testing messaging between health workers in the field and doctors in distant clinics would be beneficial. Although messaging is now a component of the mobile application rather than a stand-alone feature used via SMS or IVR, we were still able to treat it as a “probe” to determine whether doctor-CHW communication would actually occur.
Our focus on health workers as the primary users of this technology still stands. Although our project scope largely focused on developing design probes and training materials that could be tailored to the health workers’ contexts once we arrived in India, our actual development timeline has not panned out this way. Instead, most of our time has been spent troubleshooting and revising the application through field-testing and evaluation, and we are trying to get the training documentation completed and handed off to iKure in the last few days we are here.
There is much that needs to be accomplished in the last week working with iKure, but one thing that is important for us to keep in mind is that the core of our project—building a tool with and for health workers—is the same. It is our significantly shortened timeline that has affected what we can feasibly accomplish, given our partners’ high expectations and the amount of research and design that goes into this project.
Before we came to Kolkata, our team had spent the entire semester building a relationship with iKure and establishing what we thought was a justifiable and solid project plan. Our original intention was to design a technology-based tool for Indian Community Health Workers, which could support communication and problem-solving during their field visits. We fully intended to come to India with a few partially-built prototypes that we could get feedback on from the CHWs once we arrived.
Although we did not have a wealth of background context on the CHWs or their work environments from our partners, iKure’s seeming acceptance of our project plan convinced us to proceed down the path of iterative design and community-based participatory research, to build a nearly complete tool that supported CHWs’ roles.
However, things have not turned out quite as we planned during our time here in Kolkata, a discovery that has occurred on multiple levels of this project.
1) What is our actual project? Before we left for Kolkata, our team fully anticipated being able to do extensive contextual inquiry and pilot testing with a few small, yet flexible prototypes to determine what was most useful for CHWs given their workflow and variable health and technology literacy levels. Our partners at iKure seemed to agree with this plan. However, a week prior to our departure, they essentially instructed us to build a “mobile” electronic health record… and scrap the project we had been working on for the last 4 months.
2) What about Community-Based Participatory Research and Contextual Inquiry? In public health practice and research, and in UX design, an essential process to developing a project or mHealth tool for communities is gathering input from these user groups, and evaluating their current contexts and routines. Although we attempted to begin this formative information-gathering from February to May, our partners never connected us with Community Health Workers–our primary user group–to conduct interviews via Skype or phone call. Furthermore, they seemed to express a disinterest in gathering the information we needed from the health workers, and did not understand why we needed to interact with CHWs or ASHA workers at all to build the technology tool.
3) What are our roles as a student team and within the iKure framework? This discovery was two-fold, because we not only needed to reshape what our individual roles were for the project, but also how we would contribute as a cohesive team to iKure’s operations and how we were viewed by their management.
Throughout the semester, we pretty much delegated Skype discussions to Nick, which meant he had to lead most of our conversations with iKure and answer the questions they had about the project. Although we thought this made logistical sense (we knew how to standardize the calls so nobody was talking over each other), we soon realized once we got to Kolkata that the iKure management believed Nick was the project manager, and would primarily interact with him–essentially ignoring Jackie and Anjuli.
We realized as a team that our roles needed to change drastically, especially since Nick had to take on the role of developer and could not possibly balance this enormous task with project management, given that our timeline for the project was shortened from a few months to a few weeks.
We decided that with Jackie as the project manager and UX specialist, Nick as the developer and health communication specialist, and Anjuli as the public health specialist and field testing manager, we were able to spread out our responsibilities more effectively, and better convey to iKure that we were a unit, a working team that contributed equal weight and insight to the project.
As designers for this project, our goals for the remaining weeks have been to gather as much information about CHWs as possible by pushing for frequent field visits (despite our partners’ reluctance), and improve our mobile tool using the feedback gathered from the health workers.
As consultants, we need to understand the future our project has at iKure, and hand off documentation that includes: project sustainability recommendations, public health evaluations from our field visits, and technical style guides that standardize iKure’s existing technology framework, promote best practices for community-centered, action-oriented design, and emphasize the importance of building with communities instead of taking a top-down approach.
We didn’t realize back in January that our partners had a completely different end-goal in mind for this project, but during our short time working in India, our team has learned that we need to be flexible with the current situation, just as our project needs to be flexible to support the needs of the Community Health Workers working in rural Kolkata. In addition to being flexible with the project, we want to continue pushing for visits with health workers, and hope that at the end of our time here, our partners finally see the value these visits have in determining how technology can be used effectively and appropriately in community health activities.
For our field visit on June 12, 2015, our team went to the Bhai Nagar community to interview and conduct second-round field-testing of the KOL-Health App with two ASHA workers. We had a few major objectives for testing this week, including:
Understanding health workers’ current door-to-door activities in communities through a series of interview questions.
Assessing health workers’ current level of health and technology literacy through interview questions and KOL-Health App testing scenarios
Evaluating the health workers’ competence and self-efficacy in navigating different features of the KOL-Health App
Determining improvements that need to be made to the App, to better support health workers’ roles and standardize health data collection activities in the field
Our testing approach included a combination of key informant interviews with two ASHA workers from the Bhai Nagar community, followed by a run-through of a few specific tasks within the KOL-Health App to determine how the health workers interacted with the application, and where they ran into difficulties with using it.
Learning about the ASHAs
Caring for women, children and infants has become the primary responsibility of health workers, ASHA workers in particular, throughout India. The ASHAs working in Bhai Nagar described that the maternal-child health work they do is not only for the pregnant woman’s knowledge, but “for her entire family as well.” Although the first 30 minutes of one of these visits is spent taking the pregnant woman’s vitals, the ASHA worker takes the rest of the 2.5-hour period to raise the household’s awareness around: pregnancy complications the mother may have, proper nutrition for the mother and children, vitamin and nutritional supplementation the woman needs, and immunizations the children need once they are born.
In addition to these specific points of advice, the ASHA also ensures that pregnant women are being cared for regularly during the antenatal and postnatal periods, by conducting regular house visits, taking the woman to the closest health facility for antenatal care, and arranging transportation to a hospital for delivery.
Addressing emergencies, unlike the process of conducting house visits to pregnant women, is a less structured and significantly more complicated responsibility that the ASHA workers face. When discovering a pregnant woman or child with a serious problem or health condition, the ASHAs must immediately try to flag down a vehicle–a difficult task in such a remote location. Until a form of transportation arrives, if it ever does, the ASHA workers must try to find some temporary solution to the problem at hand. Whether these guidelines are detailed in their handbooks or if the ASHAs are trained on addressing these situations is unclear.
Both ASHAs told us that they did not have an extensive knowledge of technology tools; however, their use of technology in their everyday lives was vastly different. One said that she had a laptop, which she used to access her bank account and type, but she was not too familiar with mobile apps or smartphone technology. The other ASHA said she was “not very comfortable” with using technology, although she said she owned a touch phone. Whether this was a smartphone or not was unclear from our interview. She told us that she only uses her phone to do basic messaging and make phone calls–she too did not discuss any familiarity with mobile applications.
Overall Technical Findings
We noticed patterns in the issues that both ASHAs experienced with the application interface overall. Firstly, navigating to a specific form was difficult for both ASHAs, and they seemed extremely hesitant to click the “wrong buttons” or perform something incorrectly. In addition to being afraid of clicking buttons on the application overall, both ASHA workers did not seem to recognize the functionality of the “cancel” buttons we put on all the forms, nor did the circle icon at the top of the page make sense as a “save” button.
Making the application as straightforward and simple as possible for the health workers is the best solution, and that might involve placing actual “save” buttons on all the forms (not the circle icon), and “edit” buttons for every question, to allow health workers the flexibility to make changes if they do not feel comfortable cancelling out of a form.
Two major improvements that will need to be made to the KOL-Health App for our next field test are: increasing the font size on all the forms, and changing heading colors to make them distinct from the buttons. These were places where our team noticed the greatest user issues, and can easily fix for the upcoming week.
Recommendations and Improvements
As with our first field visit, this trip to Bhai Nagar was definitely focused on gathering more information from health workers, and garnering their feedback on the KOL-Health Application as it pertains to their current activities in the field. Although this application was tested among ASHA workers and not iKure CHWs, we hope that it will give both sets of health workers a more efficient way to enter data, and a central device in which they can gather and store patient information safely and securely.
After this visit to Bhai Nagar and after interacting with the ASHA workers, we now understand a few important points that need to be considered for further development of the KOL-Health App:
1. An immense amount of what the ASHAs do in their communities is establish relationships with the people and directly interact with them to raise awareness around health promotion and disease prevention.
Where KOL-Health fits in: The mobile tool that we are building for health workers should work alongside this existing workflow, which is highly interactive and embedded in social contexts of the community. The application’s interface should be simple and straightforward enough, so as not to detract from this existing relationship-building process that is central to the health workers’ roles.
2. The ASHA workers in this particular community view communication with a doctor as being useful. However, they are reluctant about this interaction currently, because they do not want to “disturb” the doctors in their work by calling them with a problem from the field.
Where KOL-Health fits in: Building a messaging feature is the next phase in our team’s development timeline for the application. This messaging component is ideally linked with a patient’s case information (vitals, symptoms, or a health survey), which can then be sent to a doctor in the closest health facility via a “referral” button or some other action. The doctor will essentially be able to view these messages whenever they log into the application site, which would sidestep the issue of health workers calling the doctors at inconvenient times.
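The store-and-forward idea behind this messaging feature can be sketched in Python. To be clear, the names here (`Referral`, `ReferralInbox`) are our own hypothetical stand-ins for illustration, not the actual KOL-Health implementation:

```python
from dataclasses import dataclass


@dataclass
class Referral:
    """A health worker's message to a doctor, linked to patient case data."""
    patient_id: str
    vitals: dict        # e.g. {"bp": "140/90", "pulse": 88}
    note: str           # free-text description of the problem


class ReferralInbox:
    """Store-and-forward inbox: workers send referrals from the field,
    and the doctor reads them at the next login, so nobody has to
    'disturb' a doctor with a phone call from the field."""

    def __init__(self):
        self._pending = []

    def send(self, referral):
        self._pending.append(referral)

    def read_all(self):
        """Called when the doctor logs in; returns and clears pending referrals."""
        pending, self._pending = self._pending, []
        return pending
```

The key design choice is that messages are pulled by the doctor at login rather than pushed to them immediately, which matches the “view messages whenever they log in” behavior described above.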
3. Important initial reactions the ASHAs had to the application included: having a difficult time seeing the text due to small text size and screen glare, and being afraid or hesitant to click any buttons without extensive prompting.
Where KOL-Health fits in: Both increasing the font size of the text and altering color contrasts to lessen issues with screen glare are simple improvements to make in the application for the next iteration of our team’s design process. Addressing the health workers’ hesitation towards using the application—and any technology tool in general for their fieldwork—requires not only proper, context-appropriate development of the interface, given the health workers’ feedback, but also implementation guidelines that iKure can follow to better tailor and sustain this technology within the rural communities their health workers are a part of.
When we started working on our project, we knew we needed to learn a lot about the community we were working with while attempting to overcome cultural barriers. Our group was inspired by how projects in Design for UNICEF used artifacts (or design probes) to elicit deep conversations with the communities they were embedded in. We sought to make our own design probes, so we could answer research and design challenges.
Unfortunately, our attempts at developing design probes and prototypes have been abandoned, and I felt it would be useful to outline the reasons we abandoned this design process.
Our partners are a health organization that makes software in a traditional manner, so the idea of purposefully testing half-baked software really goes against the grain. They want to see fully fleshed-out software, which is at odds with prototyping, where you focus only on the main actions.
Content is at a premium
We hoped the design probes we were making would elicit new content and teach us about the real needs of our users. Yet letting go of control over content is difficult for many organizations — especially health organizations, where accuracy and trust are important aspects of information and communication design.
Our initial plan was to have our designs tested before we entered the country, but for the reasons above our partner was hesitant to attempt this. So as our timeline shrank, we realized that both our learning and our production schedules needed to be shortened.
We are hoping to engage with our client’s needs, and use our current plan to build a mobile application as a starting point for this work. By engaging in a more traditional production process, we are hoping to better match our partner’s expectations of our work — even though that means moving away from a more open-ended process.
Going into this project, the Cali-Cuttas well understood that working in India would be a far different experience from doing similar work in the United States. After having numerous troublesome Skype conversations– where the calls would drop at least 4 times or our partners would plan to be on the call an hour later than originally planned–we thought that actually being in Kolkata and interacting face-to-face with everyone would make the project deliverables much easier to accomplish.
However, as this is India, and considering we are working on a very complicated problem in health care delivery and public health research, the Cali-Cuttas have quickly realized that our problems were not simply going to disappear… much like the monsoon rains have here in India. A list of our ongoing difficulties working in Kolkata are outlined below:
1) Will we ever get an Uber driver who knows how to get around the city? Something as seemingly straightforward (and easy) as calling an Uber cab to drive us to and from work takes almost 20 minutes both ways… and even then, the drivers don’t always know how to use their GPS to get us to our destination. A very frustrating task when work ends at 7:30 pm and we are tired and seriously hangry.
2) Adhering to office “rules.” There seem to be rules in place at the office, which mandate that employees come in promptly at 10:30 am and stay until 7:30 pm, with a 45-minute lunch break at 2 pm. Whether or not these rules are actually enforced is a good question. The HR person made a point to send an email to everyone stating these timings, and even questioned where we were when our team took a longer-than-expected lunch break. However, she asked someone else where we were, and never spoke to us directly. Does that mean we were in trouble for not obeying the 45-minute lunch rule?
It certainly seems that the unspoken expectation is that everyone shows up to work and sticks to the timings. However, what you choose to do in that time is up to you. I definitely saw one of the interns spend an entire day of work watching movies on her computer.
3) One full day of “good” Wifi is a miracle. The only people who get consistent internet connection at the office are the people with an ethernet cable… not so for Jackie and Nick, who have newer versions of Macs that are incompatible with the cables. Upon returning to our housing around 8:30 pm or so, it is not uncommon to discover that none of us can connect to the hotel’s three different Wifi networks. We have asked numerous times for the staff to reset the servers, but of course that does not solve the problem. It really does seem that once work hours are over, our actual use of the internet is over as well.
4) Over 100-degree heat, enough said. Over 1,100 dead from heatstroke and melting roads. We *might* be starting to miss the Ann Arbor chill… We don’t mean to be complainers. Working here in Kolkata has been mostly wonderful and interesting, and we have greatly enjoyed getting to know our co-workers. However, these continuous annoyances have made it slightly more difficult to get our work done everyday, which is something we really cannot afford on such a restricted timeline. Nevertheless, the Cali-Cuttas cannot be deterred! Not when we have taken the time to go on incredible adventures like these:
Our partners at iKure have told us numerous times that Community Health Workers generally have at most a 10th standard (high school equivalent) education, and many are unaccustomed to using electronics like smartphones or tablets in their fieldwork, let alone for their personal use.
This insight into the CHWs’ context and environment has brought up an important question about health and technology literacy, and how both are distinct from each other yet intertwined within our project.
What is “health literacy?” From our extensive literature search on health workers in various contexts, CHWs serve as the bridge between marginalized (often rural) communities and formal healthcare systems. From taking diagnostic health information to offering basic medical advice, CHWs must be versed in both procedural and theoretical aspects of health. This requires CHWs to not only communicate health metrics to communities, but more importantly, to understand the significance of these metrics and know the appropriate course of action or advice to deliver alongside the health data.
Although it is important to provide standardized and continuous training to CHWs that focuses on preventive care and health promotion, there is fine line between comprehensive health training, and giving CHWs information that is too academic and granular for them to use effectively in the field.
What is “technology literacy?” This concept has been a little more difficult to define, and has been a key focal point in the development of our project. If we are evaluating the extent of CHWs’ knowledge in using the KOL-Health Application, does this mean assessing their comprehension of the questions and fields they need to fill out? Is it being able to navigate easily through the different screens in the application? Is it understanding the terminology ascribed to each page (e.g. using the term “case” to refer to a new visit, or encounter, with a patient)?
Of course, we plan to address all of the above questions through different phases of usability testing with CHWs. By using observational techniques, direct interviews and mock run-throughs of the application, we hope to better understand if:
(1) Language barriers are a significant problem (Hindi-Bengali vs. English script)
(2) The application is easy or difficult for the CHWs to navigate through
(3) The sample content in the application is too mechanistic or academic
(4) CHWs will be competent and comfortable using the KOL-Health App regularly, and feel it is a better data collection model than paper forms.
It is one thing to train health workers to use a piece of technology in their work; it is entirely another to evaluate the tool, gather input from health workers, and actively consider “literacy” issues to create a product that best enhances the healthcare delivery activities of these users.
We had the fortune on Monday to attend an urban health camp in action, to better understand where technology can improve health workers’ workflow in delivering healthcare to poor communities in urban and rural Kolkata. We visited the Luxuria Heights construction community, where iKure had partnered with a local organization to set up a health awareness and screening camp.
This opportunity to visit a health camp was a fantastic learning space because we were able to get an idea of how the camp was run, how different stations were set up, how information was passed along from the health workers to doctors, and how long the process took.
On average there were 7-10 health workers at the camp at any one time, including at least one doctor (who wore a white lab coat) as well as a few paramedics, lab techs, and pharmacists (who wore lab coats of dark blue). These workers manned 5 major stations: patient registration and height/weight measurements, blood pressure and heart-rate monitoring, general consultation and triage of symptoms, blood testing, and ECG analysis when necessary. Since this was a construction site they did not have a station set up for women’s health in the way that they normally would.
There was a logical flow within the camp going from the basic height and weight measures through to the more in-depth blood tests and consultations with the doctor. From observing individual stations and following individual patients we were able to understand that each patient spent an average of 1.5-2 minutes at each station and that the entire process from registration through check-out took about 6-10 minutes per patient.
Some of our observations and questions are below:
Intake and height/weight: Took about 90 seconds for patients to get through. One health worker gave the patients a white index-sized card to fill out basic information, while he took notes on a clipboard.
Query 1: Who asks if these are new or returning patients? Is this a consistent question asked at the health camps? Is there somewhere on the paperwork to check off if a patient is “new” or “returning”?
Query 2: Is there additional health information (e.g. health habits) that the health worker jots down at this station?
Blood Pressure and Heart Rate: Took about 90 seconds for patient to get through both procedures, conducted by a pharmacist or paramedic. The results from these vital tests were written on the back of the index-size Health Screening Cards.
General Triage: Took about 90 seconds to go through this station. Conducted by a certified physician/clinician, who also wrote down additional free-form notes for the patient on a larger paper form, which had information transcribed from the Health Screening Card.
Query 3: What type of questions is the MD asking during the consultation? Are they consistent? Do they vary from patient to patient?
Query 4: What notes is the clinician generally taking down on the prescription sheet? Only prescription recommendations? Tailored health advice?
Query 5: When do the MDs need to enter the ICD 10 codes? Is this on the prescription form currently, or is it entered at the end of the clinic into WHIMS?
Query 6: What is the purpose of having both the Health Screening Card and the prescription sheet? Seems like additional paperwork for health workers, especially if they are transcribing information from the cards to the larger forms, and the patients are not keeping the cards or end up losing them.
ECG: No time estimate, since we only saw one patient referred here. We assume this station is for patients with more serious conditions (e.g. pain).
Query 7: If women were also attending a health camp, would the OBGYN station be located at this point in the clinic flow?
Blood Testing: About 90 seconds for patients to go through this station. Involves two health workers administering a finger prick, creating blood sample slide and writing down patient information from the prescription sheet and Health Screening Card.
Query 8: Do all health camps end with the blood testing station?
Query 9: Do patients actually save the Health Screening Cards? What is done with them after this station?
Final Data Collection: At the end of the clinic, iKure staff members photocopy the large forms and hand one to the patients, while the other is used for data entry into WHIMS.
Query 10: Would having all the information from the Health Screening Card on the prescription sheet (after the blood test station) make data transfer to WHIMS easier/ more efficient?
Query 11: What is the process for obtaining prescriptions? Do patients have to mention pain to get paracetamol? What is the process for getting iron tablets?
Query 12: On average, how many patients are prescribed paracetamol, versus iron tablets or other medications offered at the health camps? Is this given based on doctors’ notes on the larger prescription form?
Query 13: Where are patients getting prescriptions filled?
Query 14: What is the supply of medications at health camps? Does this differ between urban and rural settings?
Perceived Issues and Challenges
Data entry is time consuming, as every station involves paper forms
However, paper forms allow doctors to write more tailored information to patients, which may be challenging with an electronic form
Pervasive use of paper records increases chance of human error in data entry, which then contributes to miswritten records stored in WHIMS
Duplication of patient records is a big problem
Heavy turnover of doctors at health camps means doctors may not know if patients are new or returning to the iKure clinics; they enter patients as “new” and generate multiple PIDs for the same patient
Health workers may not be accustomed to asking if patients are “new” or “returning”
Trying to identify returning patients by name is an issue… misspelling of names is common, and WHIMS cannot flag a patient’s name as being misspelled
Query 15: How else can patients be flagged as returning? By DOB? Age? Site visited? Medical history?
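As one hedged sketch of an answer to Query 15 (this is our own illustration, not WHIMS’s actual matching logic, and the field names are hypothetical), returning patients could be flagged by fuzzy-matching names and then confirming on a second field such as date of birth:

```python
from difflib import SequenceMatcher


def likely_same_patient(record_a, record_b, name_threshold=0.8):
    """Flag two registration records as likely the same person.

    Names are fuzzy-matched because misspellings are common; a second
    field (here, date of birth) must also agree before flagging, to
    keep false positives down.
    """
    name_score = SequenceMatcher(
        None, record_a["name"].lower(), record_b["name"].lower()
    ).ratio()
    return name_score >= name_threshold and record_a["dob"] == record_b["dob"]
```

Under this scheme, a pair like “Anjali Das” and “Anjoli Das” with the same date of birth would be flagged for review, while records that agree on the name but differ on date of birth would not.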
So all in all we learned a lot, but as usual we ended the day with more questions than we started with.