Quality Assessment and Usability Inspection of the Home and School Food Consumption (CADE) Application

Abstract

Objective: To evaluate the quality and inspect the usability of the Home and School Food Consumption (CADE) application. Methods: This was a cross-sectional study of an online tool developed for mobile devices (smartphones and tablets) to assess the food intake of schoolchildren. Five specialists in Human-Computer Interaction (HCI) performed a heuristic evaluation of the application and rated the severity of the problems found. Five health professionals evaluated the quality of the application by answering a 21-question questionnaire. The frequency of heuristic violations, the most reported problems, and the mean severity per heuristic were estimated. Results: A total of 74 usability problems were identified, of which 17.8% violated the flexibility and efficiency of use heuristic. Conclusion: Health professionals considered the application easy to learn and use and made several suggestions for its improvement.


do Nascimento, J.V., de Lima, A.N., Neto, O.J.M. and Araujo, M.C. (2025) Quality Assessment and Usability Inspection of the Home and School Food Consumption (CADE) Application. Food and Nutrition Sciences, 16, 787-798. doi: 10.4236/fns.2025.167044.

1. Introduction

The development and use of technologies to collect dietary data in childhood have become widespread over the last decade but remain scarce in low- and middle-income countries such as Brazil. Examples include web-based software and mobile phone applications [1]. These tools offer advantages by standardizing and streamlining data collection, reducing participants’ workload, increasing the accuracy of estimates, and curbing research costs [2].

The Home and School Food Consumption (CADE) application was developed in response to the scarcity of tools for assessing the food consumption of Brazilian schoolchildren. It is an online tool created for mobile devices (smartphones and tablets) that assesses the food intake of Brazilian schoolchildren across all their food environments, with emphasis on the home and school environments [3].

However, any new technology must be evaluated for usability. The ISO 9241-11 standard defines usability as “the extent to which a system, product, or service can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use” [4]. This evaluation is an essential step in technology development, as it identifies problems and errors that, once corrected, increase the effectiveness and efficiency of use, avoid inconsistencies, reduce the learning time, and increase user satisfaction with the tool [5].

One method for evaluating software usability is usability inspection, which can be applied at any stage of application development. Among the existing inspection methods, Heuristic Evaluation (HE) is the most widely used [6]. In this assessment, a small group of Human-Computer Interaction (HCI) experts, usually three to seven, analyzes the interfaces and judges their characteristics against recognized principles called heuristics to identify design errors that compromise the system’s usability [7]. The method also indicates the severity level of the observed problems and possible solutions [8].

The system’s overall quality in terms of usability can also be analyzed through questionnaires answered by users who are not HCI experts, in what is called a survey evaluation. It aims to identify qualitative aspects relevant to the system’s context and to estimate users’ overall satisfaction [9]. Thus, this study aimed to evaluate the quality and inspect the usability of the Home and School Food Consumption (CADE) application.

2. Methods

2.1. Study Design and Population

This cross-sectional study selected a convenience sample of 10 usability experts, indicated by the research group of the interactive web and multimedia systems laboratory of the Institute of Mathematical and Computer Sciences of the University of São Paulo, to perform the application’s usability inspection; five agreed to participate in the test. All had a doctorate in the field and HCI experience. We also adopted a convenience sample of 13 health professionals with experience in food consumption assessment or knowledge of technologies applied in dietary assessment. These participants were selected by searching CVs on the Lattes platform or studies published in the relevant fields. Five of them agreed to participate in the test.

According to Nielsen [10], the creator of heuristic evaluation, the participation of five information technology professionals—preferably usability experts—is sufficient to identify approximately 75% to 80% of the usability issues. Furthermore, the author suggests that usability studies involving end users, such as the one conducted with nutritionists in the present research, can also achieve reliable results with just five participants [11].
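Nielsen’s claim about five evaluators comes from the Nielsen–Landauer model, in which the expected fraction of problems found grows as 1 − (1 − L)^n for n evaluators. The sketch below is illustrative only: the single-evaluator detection rate L = 0.31 is the average Nielsen reports, and the exact value varies from study to study.

```python
# Nielsen-Landauer model: expected fraction of all usability problems
# found by n independent evaluators, each detecting a given problem
# with probability L (here L = 0.31, Nielsen's reported average).
def proportion_found(n_evaluators: int, detection_rate: float = 0.31) -> float:
    """Expected fraction of usability problems found by n evaluators."""
    return 1 - (1 - detection_rate) ** n_evaluators

for n in (1, 3, 5, 10):
    print(f"{n} evaluators -> {proportion_found(n):.0%} of problems")
```

With L = 0.31, five evaluators are expected to uncover roughly 84% of the problems, consistent with the 75% to 80% range cited above for slightly lower detection rates.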

2.2. CADE Application—Home and School Food Consumption

The CADE application—Home and School Food Consumption (INPI-BR512021001151-1)—is an online tool developed for mobile devices, such as smartphones and tablets, to assess the food intake of Brazilian schoolchildren aged 4 to 9 years. It is the only Brazilian technology that quantitatively and qualitatively assesses food consumption across all the children’s food environments. The application has three user interfaces: school, home, and an interface for entering data previously collected on paper or with another tool. The school interface records what the child eats at school through a food record. The home interface covers the child’s food intake inside and outside the home; it must be completed by the child’s caregiver, who knows the child’s diet on the previous day, through a multi-step 24-hour recall combined with a Food Propensity Questionnaire (FPQ).

2.3. Data Collection

The tests were conducted in two stages: first, the application was inspected by five HCI experts using the Heuristic Evaluation (HE) method. Before the evaluation, each expert received an email with instructions on how to access the mobile application (available only for the Android operating system), with the link to download the application, login, password, and three codes for a fictitious child to access the three interfaces of the application: school, home, and data previously collected on paper.

Since the heuristics developed by Nielsen were intended for desktop systems and do not cover the evaluation of mobile devices, they have undergone several revisions over the years and have been adapted to new technologies, such as mobile phones and tablets [10]. In this context, Gresse von Wangenheim [12], based on a systematic review, adapted Nielsen’s traditional usability heuristics for touchscreen mobile applications through a structured three-step process. First, the set of heuristics was decomposed into a collection of measurement items representing the abstract quality construct of usability. These items were then operationalized into a questionnaire (referred to as a checklist), grounded in device-specific interpretations and common usability problems. Finally, the checklist was validated through an empirical study in which the results of 247 heuristic evaluations were statistically analyzed using Item Response Theory (IRT).

Therefore, this study adopted the set of 10 principles and guidelines adapted for touchscreen mobile phones and validated by Gresse von Wangenheim [12], based on the 10 heuristics proposed by Nielsen [10].

The heuristics were unified in a checklist made available in the cloud via a Google spreadsheet. Besides the proposed heuristics, the checklist allowed evaluators to add heuristics if they identified other problems or errors during the evaluation. The evaluators received the checklist link by email and inspected the entire system, comparing the application interfaces against the heuristics to identify possible interface errors.

Thus, each expert recorded the problem(s) found in the application, the violated heuristic, the location of the problem, how it was identified, and its severity, rated on the scale proposed by Nielsen: 1) not a usability problem; 2) cosmetic problem (lowest severity); 3) minor problem; 4) major problem; 5) catastrophic problem (highest severity) [10]. Suggestions for improvement were also collected. The test was conducted remotely from April 12 to June 12, 2023.

Then, five healthcare professionals evaluated the application. Before the evaluation, each professional received an email with instructions for accessing the mobile application (available only for the Android operating system), including a link to download the application, login, password, and codes for the fictitious child related to the school and home environments. The email also contained a video demonstrating each application interface and an instruction manual for accessing the application.

After accessing the application, the professionals evaluated the quality of the CADE application. Then, they completed an electronic questionnaire made available via the Google Forms platform, consisting of eight blocks of questions regarding the application’s functionalities and food items in the school and home interfaces, totaling 21 questions. The test was run remotely from June to August.

2.4. Data Analyses

The usability problems found were categorized by problem type and by the heuristic violated. The frequencies of the reported problems and of the violated heuristics were estimated, the mean severity of the most reported problems was computed for each heuristic principle, and the identified problems were described.
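The tallying step above can be sketched as follows. The records below are invented for illustration (they are not the study’s data): each record pairs the violated heuristic with a severity rating on the 1–5 scale used in this study.

```python
# Sketch of the analysis: group hypothetical problem records by the
# heuristic violated, then report violation frequency (%) and mean
# severity with its minimum-maximum range, as in Table 1.
from collections import defaultdict

records = [  # (heuristic violated, severity 1-5) -- illustrative only
    ("Flexibility and usage efficiency", 4),
    ("Flexibility and usage efficiency", 5),
    ("System status visibility", 5),
    ("Readability and layout", 2),
]

by_heuristic = defaultdict(list)
for heuristic, severity in records:
    by_heuristic[heuristic].append(severity)

total = sum(len(v) for v in by_heuristic.values())
for heuristic, sev in sorted(by_heuristic.items(), key=lambda kv: -len(kv[1])):
    freq = 100 * len(sev) / total
    mean = sum(sev) / len(sev)
    print(f"{heuristic}: n={len(sev)} ({freq:.1f}%), "
          f"mean severity {mean:.1f} ({min(sev)} - {max(sev)})")
```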

3. Results

A total of 88 usability problems were found in the evaluation carried out by the HCI professionals. Four of these did not fit any heuristic and referred to the abrupt termination of the application for unidentified reasons. After grouping the reported problems by similarity, 74 distinct problems remained, which violated the heuristic principles 135 times; 43 problems violated more than one heuristic. One evaluator added a new heuristic beyond those proposed, which he called “error prevention” (Table 1).

Table 1. Frequency of heuristic violations (%) and the mean severity of problems found by each heuristic principle.

| Heuristics | Frequency of violations (n) | % | Mean severity score (minimum - maximum) |
| --- | --- | --- | --- |
| Flexibility and usage efficiency | 24 | 17.8 | 3.9 (2 - 5) |
| System status visibility | 21 | 15.6 | 4.3 (2 - 5) |
| System control and freedom | 21 | 15.6 | 4.3 (2 - 5) |
| Consistency and patterns | 19 | 14.0 | 3.4 (2 - 5) |
| Correspondence between the system and the real world | 11 | 8.1 | 4 (2 - 5) |
| Recognition instead of remembrance | 9 | 6.6 | 4.5 (3 - 5) |
| Little human/device interaction | 8 | 6.0 | 4.3 (3 - 5) |
| Readability and layout | 8 | 6.0 | 2.8 (2 - 4) |
| Aesthetics and minimalist design | 7 | 5.1 | 4 (2 - 5) |
| Physical interaction and ergonomics | 5 | 3.7 | 4.1 (3 - 5) |
| Error prevention | 2 | 1.5 | 5 (-) |
| Total | 135 | 100 | |

The most violated heuristics were “flexibility and usage efficiency” with 17.8% (n = 24), “system status visibility” with 15.6% (n = 21), and “system control and freedom” with 15.6% (n = 21). In turn, the least violated heuristics were “error prevention” with 1.5% (n = 2), “physical interaction and ergonomics” with 3.7% (n = 5), and “aesthetics and minimalist design” with 5.1% (n = 7) (Table 1).

Most heuristics were violated by severe or catastrophic usability problems (mean ≥ 4), with the highest mean severity found for problems associated with the heuristics “error prevention” (mean 5), “recognition instead of remembrance” (mean 4.5), “system status visibility” (mean 4.3), “system control and freedom” (mean 4.3), and “little human/device interaction” (mean 4.3). The lowest mean severity was found for problems associated with the heuristic “readability and layout” (mean 2.8) (Table 1).

When analyzing the nature of the usability problems, the heuristic evaluation revealed 10 problems identified by more than one evaluator; “unable to advance or go back through the application screens” was the most reported (n = 4) and had the highest severity level. Other problems with the highest severity level were “registered food field appeared blank after registration”, “unable to modify registered child data”, and “food information presented on different screens” (Table 2).

In the quality assessment conducted by healthcare professionals, the CADE app was considered easy to learn and use. However, some of the feedback provided by these professionals corroborates issues previously identified by HCI experts. These similarities are evident, for instance, when healthcare professionals reported that the mobile app interface did not effectively support the process of completing the required information—40% (n = 2) (Table 3)—and that the questions regarding commonly forgotten foods were perceived as repetitive and tiring—60% (n = 3) (Table 4).

Table 2. Usability problems most reported by evaluators by severity classification.

| Most reported usability problems | Number of evaluators | Cosmetic (minor) | Minor | Major | Catastrophe (major) | Mean severity score |
| --- | --- | --- | --- | --- | --- | --- |
| Unable to advance or go back through the application screens | 4 | 0 | 0 | 0 | 4 | 5 |
| Registered food field appeared blank after registration | 2 | 0 | 0 | 0 | 2 | 5 |
| Unable to modify registered child data | 2 | 0 | 0 | 0 | 2 | 5 |
| Food information presented on different screens | 2 | 0 | 0 | 0 | 2 | 5 |
| Record of commonly forgotten foods and drinks | 3 | 0 | 1 | 0 | 2 | 4.3 |
| Buttons do not reflect the tasks when clicked | 3 | 0 | 0 | 2 | 1 | 4.3 |
| Time field | 3 | 0 | 1 | 1 | 1 | 4 |
| Long waiting time for login and password to appear | 3 | 0 | 1 | 1 | 1 | 4 |
| The system does not save login data | 2 | 0 | 2 | 0 | 0 | 3 |
| Screen colors | 3 | 1 | 2 | 0 | 0 | 2.7 |
| Total | 24 | 2 | 10 | 5 | 15 | |

Regarding the section of questions related to completing the 24-hour recall in the home environment, only one professional reported the absence of a field to specify the reported day or noted a missing item among the selection options (Table 4). In the section addressing the detailed description of food items in both school and home settings, the majority of professionals (60%, n = 3) experienced difficulties accurately detailing food items, selecting appropriate food portion photographs, or uploading meal images. Additionally, 40% (n = 2) recommended modifications to the question concerning repeated foods and reported challenges documenting food leftovers (Table 4). With respect to the application overall, 60% of healthcare professionals characterized their initial impression of the app as average, while 40% rated it as good.

Table 3. Frequency of responses from healthcare professionals in assessing the quality of the Home and School Food Consumption (CADE) application.

| Question blocks about the application | Yes (n) | % | Comments |
| --- | --- | --- | --- |
| General questions about the application | | | |
| Difficult to learn how to use the application | 0 | 0 | — |
| The mobile application interface did not assist in the development of the steps for filling in the requested information | 2 | 40 | “The back-and-forth buttons on the phone itself were confused with the buttons on the app.”; “Keyboard covering the green buttons (next, finish).”; “It’s hard to go back and correct things.”; “It takes too many clicks to perform actions.” |
| Some application functionality takes longer to respond | 4 | 80 | “Filling in the food consumption at home.”; “The Food Propensity Questionnaire part.” |
| Some app features are confusing or inappropriate | 2 | 40 | “For YOGURT, the picture did not appear when I clicked ‘children’s bowl, see picture’. This also happened with other foods.”; “Some foods were not easy to find.” |
| Step on general information about the child | | | |
| Some fields regarding general information about the child are missing | 1 | 20 | “A field to enter the child’s date of birth.” |

Table 4. Frequency of healthcare professionals’ responses (%) about the interfaces and the Home and School Food Consumption (CADE) application’s functionalities.

| Question blocks about the application’s interfaces and features | Yes (n) | % | Comments |
| --- | --- | --- | --- |
| Assessment of the home environment interface | | | |
| There was a missing field to include on the reported day | 1 | 20 | “Daily water consumption.” |
| Some item(s) is (are) missing in the fields that have selection options | 1 | 20 | “Review the list of supplement frequencies. Adapt them to the recommendations of the Ministry of Health.” |
| Reporting of food items (school and home environment) | | | |
| Difficult to report a food item | 2 | 40 | “Difficulty finding the food item on the list.”; “The app’s ‘Continue’ button is below the keyboard.”; “Restricted list of food items.” |
| Questions about commonly forgotten foods (school and home environment) | | | |
| There was a question missing in the commonly forgotten foods stage | 0 | 0 | — |
| Questions about commonly forgotten foods are tiresome or repetitive | 3 | 60 | “The question about fluid intake.”; “Having to answer when a food item is forgotten.” |
| Detailing of food items (school or home environment) | | | |
| Difficulty in detailing food items | 3 | 60 | “I couldn’t see the pictures of the children’s cups because they didn’t open.”; “The photo took a long time to load (more than 1 minute), and I gave up.” |
| Difficulty selecting a picture of the food portion | 3 | 60 | “The photo didn’t appear; it took a long time to load (more than 1 minute), and I gave up.” |
| There is a missing field for completing information about the details of food items | 2 | 40 | “The truth is that the items are in alphabetical order, which doesn’t make much sense, e.g., when you search for ‘meat’ the first thing that appears is mashed meat, for example.” |
| Suggested any changes to the question about repeating foods | 2 | 40 | “It ends up influencing someone to repeat a food that has already been reported. So, it doesn’t make sense to ask whether someone ate the food when they have already reported it.” |
| You disagreed with how the question was asked about the origin of the food consumed by the child in a school environment | 0 | 0 | — |
| Difficulty in taking a picture of the meal | 3 | 60 | “The program crashes, and I can’t cancel it after waiting 7 minutes.”; “My camera didn’t open.” |
| Difficulty recording leftovers of a food item | 2 | 40 | “I didn’t get there.”; “I didn’t find that option.” |
| Assessment of the home environment interface | | | |
| I would add/remove some options in the field regarding meal consumption locations | 0 | 0 | — |
| I would include some other selection options in the question about doing activities while the child ate | 0 | 0 | — |

4. Discussion

The CADE application was considered easy to learn and use by professionals with experience in food consumption assessment. However, the HCI experts identified 74 distinct usability problems, more than half of which were rated at the higher severity levels.

The “flexibility and usage efficiency” heuristic was one of the most violated and was associated with the problems most reported by experts. One problem violating this heuristic appeared on the screens about frequently forgotten foods: when the user answered “yes” to a forgotten item, the entire flow restarted instead of returning to where the user had left off. The violation of this heuristic indicates that the application interface needs to meet the needs of both lay and experienced users, allowing faster interaction as users become familiar with the interface.

The most frequently reported issues also violated the “system control and freedom” heuristic. HCI experts pointed out that, depending on the point at which the form was completed, it was impossible to return to previous steps, which prevented users from changing the child’s data in the application. Healthcare professionals reported the same issue, suggesting that it is a severe problem that should be fixed as a priority. As an improvement, experts suggested that the app allow users to return to previous steps using a back button.

Comparing this study’s results with the academic literature is limited, as no national or international study was found that applied the heuristic evaluation method to inspect the usability of a technology used primarily for dietary assessment. Existing studies are mainly international and emphasize the usability inspection of health technologies in general. We did identify studies that performed usability tests of technologies for children’s dietary assessment, but with their end users, unlike the present study, which investigated usability inspection by HCI experts. Thus, the comparison presented here considers similar studies.

Among the Brazilian studies found, we can mention the work by Ruggeri et al. [13], who tested the usability of a Health and Food Monitoring System—School Nutrition (NUTRISIM), a computerized system for data collection, assessment, and monitoring of the schoolchildren’s health and nutritional status. In this study, 17 IT professionals evaluated the tool’s usability via the Systems Usability Questionnaire to express their opinions regarding six metrics (ease of learning and recall, error control, efficiency, effectiveness, and satisfaction). Like the present study, the authors observed that the system was easy to understand and use.

The “aesthetics and minimalist design” heuristic was one of the least violated in the present study, in contrast to the study in the United States by Fu et al. [14], who found more violations of this heuristic when inspecting the usability of four applications developed to assist in the treatment of diabetes. Regarding the severity of the problems found, the values obtained by the authors were similar to ours: more than half of the problems were categorized as major or catastrophic, and a smaller number as cosmetic.

Although heuristic evaluation is an inspection method remarkably capable of finding usability problems, it is recommended to associate heuristic evaluation with other usability evaluation methods, such as user testing [15]. The two usability evaluation methods complement each other, as heuristic evaluation involves experts interacting with the interface to identify issues, while user testing employs questionnaires or interviews to gather users’ opinions and experiences [16] [17].

One of the results found in the present study by health professionals was the ease of learning how to use the application. A similar result was found in the study conducted by Davies et al. [18], which aimed to report the discussions of focus groups held with nutritionists about the Food Consumption and Physical Activity of Schoolchildren (CAAFE) questionnaire, a tool for online monitoring of food intake of schoolchildren aged 7 to 10 years. One of the results found by the authors was the ease of use of the instrument’s online version.

Also, in the study by Contreras-Guillén et al. [19], which developed an automated open-access tool (MAR24) for collecting 24-hour dietary recalls using the multi-step method for adults in Argentina, seven nutritionists performed a qualitative and quantitative evaluation of the tool, and 71.4% of them reported that it is easy to use.

In the present study, the CADE app was considered easy to learn and use by professionals experienced in dietary intake assessment. Moreover, it was observed that the issues identified by HCI specialists closely matched those reported by healthcare professionals. A common problem noted by both groups was difficulty navigating forward and backward within the app screens, which impeded the correction of reported information. Both groups also indicated that the section for recording commonly forgotten foods and beverages was repetitive and tiresome, and that certain app buttons were confusing and did not function as expected.

One limitation of the present study is that one of the CADE application’s interfaces was not evaluated. This interface was developed for entering dietary data previously collected in the traditional paper format. Although the experts were instructed to test it, the HCI experts did not, and the healthcare professionals made no comments about it. However, this interface is similar to the home environment interface that was tested; therefore, we consider that the problems identified for the home environment also apply to the untested interface.

5. Conclusions

This study was not conducted to identify all the CADE application’s usability problems. However, it demonstrated that heuristic evaluation with a few experts can identify many usability problems and improve the application. In this sense, considering the problems reported in the heuristic evaluation is vital for improving the application’s overall usability, product quality, and user satisfaction.

The evaluation of the CADE app’s interfaces and functionalities by healthcare professionals was valuable, as the findings of this study will enable the improvement of the tool and contribute to future usability studies involving technologies that assess dietary intake, particularly in childhood. It is important to highlight that this study was the first to conduct a usability inspection using the heuristic evaluation method on a technology designed to assess the dietary intake of school-aged children, thus providing a foundation for subsequent research on this topic.

The results of this study will improve the CADE application and be useful for computing and health experts interested in improving the usability of mobile device applications. The application’s refinements should focus on the most severe and most reported problems in the tests, such as being unable to advance or go back through the application’s screens; the registered food field appearing blank after registration; being unable to modify registered child data; food information presented on different screens; difficulty detailing food items, taking a photo of the meal, and selecting a photo of the food portion; and the recording of commonly forgotten foods and drinks.

Acknowledgements

This study was supported by FAPERJ—Carlos Chagas Filho Foundation for Research Support of the State of Rio de Janeiro (SEI Process: E-26/201.420/2022) and CNPq—National Council for Scientific and Technological Development (PIBIC and PIBIT scholarship holder).

Ethical Approval

This study was approved by the Research Ethics Committee (REC) of the Sergio Arouca National School of Health of the Oswaldo Cruz Foundation (ENSP/FIOCRUZ—CAAE 36083120.80000.5240).

Declaration

The lead author affirms that this manuscript is an honest, accurate and transparent account of the study being reported. There are no important aspects of the study that have been omitted and any discrepancies from the study as planned have been explained.

All authors reviewed and commented on subsequent drafts of the manuscript and approved the final manuscript.

Conflicts of Interest

The authors declare no conflicts of interest regarding the publication of this paper.

References

[1] Mata, J.d.S., Freitas, J.V., Crispim, S.P., Interlenghi, G.S., Magno, M.B., Ferreira, D.M.T.P., et al. (2023) Technological Tools for Assessing Children’s Food Intake: A Scoping Review. Journal of Nutritional Science, 12, e43.
https://doi.org/10.1017/jns.2023.27
[2] Eldridge, A.L., Piernas, C., Illner, A., Gibney, M.J., Gurinović, M.A., De Vries, J.H.M., et al. (2018) Evaluation of New Technology-Based Tools for Dietary Intake Assessment—An ILSI Europe Dietary Intake and Exposure Task Force Evaluation. Nutrients, 11, Article No. 55.
https://doi.org/10.3390/nu11010055
[3] Freitas, J.V., Crispim, S.P. and Araujo, M.C. (2022) Development of a Mobile Application to Assess Brazilian Schoolchildren’s Diet: CADE—Food Consumption at Home and at School. Journal of Nutritional Science, 11, e27.
https://doi.org/10.1017/jns.2022.25
[4] International Organization for Standardization (2018) ISO 9241:11:2018 Ergonomics of Human-System Interaction—Part 11: Usability: Definitions and Concepts. 2nd Edition, ISO, 29 p.
[5] Bevan, N., Claridge, N. and Petrie, H. (2005) Tenuta: Simplified Guidance for Usability and Accessibility. Proceedings of HCI International, Las Vegas, 22-27 July 2005, 1-8.
[6] Maramba, I., Chatterjee, A. and Newman, C. (2019) Methods of Usability Testing in the Development of eHealth Applications: A Scoping Review. International Journal of Medical Informatics, 126, 95-104.
https://doi.org/10.1016/j.ijmedinf.2019.03.018
[7] Rocha, H. da and Baranauskas, M.C. (2003) Design e avaliação de interfaces humano-computador. NIED/Unicamp.
[8] Ahmad, N., Rextin, A. and Kulsoom, U.E. (2018) Perspectives on Usability Guidelines for Smartphone Applications: An Empirical Investigation and Systematic Literature Review. Information and Software Technology, 94, 130-149.
https://doi.org/10.1016/j.infsof.2017.10.005
[9] Martins, A.I., Queirós, A., Rocha, N.P. and Santos, B.S. (2013) Avaliação de Usabilidade: Uma Revisão Sistemática da Literatura. Iberian Journal of Information Systems and Technologies, 11, 31-44.
https://doi.org/10.4304/risti.11.31-44
[10] Nielsen, J. (1994) Heuristic Evaluation. In: Nielsen, J. and Mack, R.L., Eds., Usability Inspection Methods, John Wiley & Sons, 25-62.
[11] Nielsen, J. (2012) How Many Test Users in a Usability Study. Nielsen Norman Group.
https://www.nngroup.com/articles/how-many-test-users/
[12] Gresse von Wangenheim, C., Witt, T.A., Borgatto, A.F., Nunes, J.V., Lacerda, T.C., Krone, C., et al. (2016) A Usability Score for Mobile Phone Applications Based on Heuristics. International Journal of Mobile Human Computer Interaction, 8, 23-58.
https://doi.org/10.4018/ijmhci.2016010102
[13] Ruggeri, B.F.F., Voci, S.M., Borges, C.A. and Slater, B. (2013) Assessment of the Usability of a Nutritional Epidemiology Computerized System. Revista Brasileira de Epidemiologia, 16, 966-975.
https://doi.org/10.1590/s1415-790x2013000400016
[14] Fu, H.N.C., Rizvi, R.F., Wyman, J.F. and Adam, T.J. (2020) Usability Evaluation of Four Top-Rated Commercially Available Diabetes Apps for Adults with Type 2 Diabetes. CIN: Computers, Informatics, Nursing, 38, 274-280.
https://doi.org/10.1097/cin.0000000000000596
[15] Dix, A. (2009) Human-Computer Interaction. In: Liu, L. and Özsu, M.T., Eds., Encyclopedia of Database Systems, Springer US, 1327-1331.
https://doi.org/10.1007/978-0-387-39940-9_192
[16] Santana, C.A., Alcantra, R.A., Siebra, S.A. and Ávila, B.T. (2018) Comparando métodos de avaliações de usabilidade, de encontrabilidade e de experiência do usuário. Informação & Tecnologia, 3, 83-101.
[17] Dias, C. (2003) Usabilidade na Web: Criando portais mais acessíveis. Alta Books.
[18] Davies, V.F., Kupek, E., de Assis, M.A., Engel, R., da Costa, F.F., Di Pietro, P.F., et al. (2014) Qualitative Analysis of the Contributions of Nutritionists to the Development of an Online Instrument for Monitoring the Food Intake of Schoolchildren. Journal of Human Nutrition and Dietetics, 28, S65-S72.
https://doi.org/10.1111/jhn.12209
[19] Contreras-Guillén, I.A., Leeson, S., Gili, R.V., Carlino, B., Xutuc, D., Martins, M.C.T., et al. (2021) Development and Usability Study of an Open-Access Interviewer-Administered Automated 24-h Dietary Recall Tool in Argentina: MAR24. Frontiers in Nutrition, 8, Article 642387.
https://doi.org/10.3389/fnut.2021.642387

Copyright © 2025 by authors and Scientific Research Publishing Inc.


This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.