Insights on usability testing: The effectiveness of an adaptive e-learning system for secondary school mathematics

INTRODUCTION
The digital revolution has led to significant changes in various sectors, notably in education, which has widely adopted these advancements. E-learning, leveraging electronic media and information technologies, has become a vital force in the educational landscape, offering enriched experiences, flexibility, and the capability to overcome both geographic and temporal barriers (Bell & MacDougall, 2013; Reeves et al., 2017). The significance of e-learning became even more pronounced with the onset of the COVID-19 pandemic. As schools and institutions worldwide grappled with closures, e-learning platforms provided the continuity needed to ensure that learning was not disrupted, highlighting their indispensable role in modern education.
Traditional e-learning platforms, often characterized by a static, one-size-fits-all approach, have shown limitations, particularly when addressing the diverse learning needs of students. Such platforms tend to offer the same learning experience to everyone, which may not suit each student's pace, preferences, and prior knowledge (Alghabban & Hendley, 2021). Adaptive e-learning systems are emerging as a better solution. These systems adjust in real time to fit each student's learning needs, ensuring that the content matches their current understanding and pace (Taurah et al., 2020). They continually assess a student's performance and adjust the content, ensuring personalized instruction at the right level (Brusilovsky, 2007; Sweta & Lal, 2017). In secondary school education, adaptive e-learning systems have carved out a significant role, especially in subjects like mathematics. Their adaptive features provide a blend of interactive exercises, real-time feedback, and personalized learning paths, offering students a dynamic way to engage with and understand mathematical concepts (Kem, 2022).
A strong mathematical foundation at an early level of education is crucial for school children. It leads not only to their academic success but is also the key to excelling in more advanced mathematics studies at the tertiary level (Musiimenta et al., 2019). Furthermore, students with a good mathematical foundation may explore many other exciting opportunities in the expanding fields of science, technology, engineering, and mathematics (STEM) (Wang et al., 2013). These benefits, however, depend on the instructional materials used to teach mathematics being well-designed and effectively implemented. Research has shown that the quality of instructional materials significantly impacts students' learning outcomes in mathematics (Kul et al., 2018). Well-designed instructional materials can provide students with clear explanations, engaging activities, and opportunities for practice and application.
On the other hand, poorly designed materials, including those that contain ambiguities, unclear language, confusing representations, or even misrepresentations, will result in frustration, hinder learning, and create lasting misconceptions (Ainsworth et al., 2002). Students may struggle to understand the concepts being taught and continuously make mistakes in their calculations. Such negative experiences with poorly designed materials can contribute to students feeling less confident in their ability to solve mathematics problems, to math anxiety, to negative attitudes towards the subject, and to poorer academic performance (Barroso et al., 2021; Khasawneh et al., 2021; Ramirez et al., 2018).
From a teaching point of view, teachers who use poorly designed materials and experience usability issues with them will need to spend more time clarifying concepts, addressing students' confusion, and answering questions arising from the materials' shortcomings. This adds to teachers' burden, both physically and cognitively (Ma'arop & Embi, 2016), and reduces their focus on core instruction. In addition, the time spent addressing usability problems takes away from opportunities to explore deeper mathematical concepts, hindering learning progress and potentially affecting curriculum coverage.
Therefore, selecting and developing quality instructional materials is crucial to minimize confusion and maximize learning outcomes for students. Usability testing plays a significant role in this process by identifying any shortcomings or issues within the materials that may hinder student understanding and engagement. By conducting thorough usability testing, teachers can address these concerns and make necessary revisions to ensure that the instructional materials are accurate, user-friendly, and effective in promoting student learning in mathematics.
Thus, the study explored the potential of adaptive e-learning, with a focus on secondary school mathematics. The objective was to gather insights on the usability of a newly developed adaptive e-learning system prototype, named Mythematix, to contribute to evidence-based design principles for improving adaptive e-learning of mathematics. The study therefore posed two primary research questions to deepen our understanding of Mythematix's impact: 1. How do the students perceive the usability of the developed Mythematix prototype? 2. What are the key strengths and areas of improvement for the Mythematix prototype as identified by user feedback? These questions were essential in evaluating the overall potential of the Mythematix system in the context of secondary school mathematics education and in determining its effectiveness in meeting the diverse learning needs of students.

LITERATURE REVIEW

Adaptive E-Learning Systems in Mathematics Education
In the context of mathematics education, the integration of adaptive e-learning systems has been promising. Empirical studies have shown that the use of adaptive e-learning systems improves students' performance in mathematics (Jonsdottir et al., 2017; Masood & Mokmin, 2017), provides a positive learning experience for students (Liu et al., 2017; Walkington & Bernacki, 2019), and increases learning effectiveness (Hubalovsky et al., 2019).
The effectiveness of adaptive e-learning systems was especially significant during the COVID-19 pandemic. Despite the scarcity of research conducted at the school level, a few researchers recorded that the individualized learning approaches provided by such systems boosted students' motivation, learning autonomy, and understanding of mathematics among high school (Moreno-Guerrero et al., 2020) and primary school students (Hubalovsky et al., 2019). These systems have also been highly accepted among elementary students in Athens (Troussas et al., 2019).
Even though most studies confirmed the effectiveness of adaptive e-learning systems in improving student performance in the mathematics domain, several studies observed no significant achievement gains from using such systems (e.g., Clark & Kaw, 2019; Martin et al., 2020; Phillips et al., 2020). Some of the possible causes are the limitations of short-term performance measures, lack of personalized instruction, and student attendance and engagement issues. For instance, Clark and Kaw (2019) found no significant difference in academic achievement with adaptive e-learning in certain contexts, which they attributed to the reliance on short-term performance measures that may not accurately reflect long-term learning and comprehension. Meanwhile, Phillips et al. (2020) observed that even when adaptive e-learning systems were implemented, the lack of personalized instruction and student engagement limited their effectiveness. This lack of engagement could be due to various factors, including uninteresting or irrelevant content, poor user interface design, or a disconnect between the e-learning activities and students' personal academic goals.
These observations highlight the need for comprehensive usability studies to address these issues. Usability studies can provide valuable insights into how adaptive e-learning systems are used in real-world educational settings, identify specific areas where these systems fall short of meeting student needs, and suggest improvements.

Usability Evaluation in E-Learning Systems
Usability, typically measured by user satisfaction, efficiency, and overall effectiveness, is crucial in shaping a user's experience and, consequently, the success of any digital tool (Nielsen, 1994). For secondary school students, the usability of an adaptive e-learning platform can significantly influence their level of engagement, motivation, and, ultimately, academic outcomes. Usability in e-learning systems is essential for determining how users perceive the system's effectiveness, especially in terms of ease of use, efficiency, and satisfaction. Grinberg and Hristova (2012) emphasize that the success of e-learning hinges on its usability. They propose three approaches to assess usability: heuristic evaluation by experts, user testing, and questionnaires for user satisfaction.
Beyond that, Asarbakhsh and Sandars (2013) spotlight two primary usability testing methods: direct feedback from learners through questionnaires or interviews, and observational methods like scenario observation and thinking aloud. While surveys offer structured feedback, observational methods provide insights into real-time user interactions. Meanwhile, Balaban et al. (2011) categorize usability evaluation methods into two: usability inspection methods and empirical testing. While the former involves expert reviews, the latter, also known as user-based methods, often takes the form of user evaluation surveys.
Usability surveys have consistently served as a critical instrument for gauging the efficiency and user experience of digital platforms, especially e-learning systems. Among the most recognized are the Questionnaire for User Interaction Satisfaction, which emphasizes the human-computer interface, and the Software Usability Measurement Inventory, which focuses on software quality from the user's perspective. The Usefulness, Satisfaction, and Ease of Use questionnaire, with its comprehensive approach, assesses user satisfaction, ease of use, and usefulness. Meanwhile, the System Usability Scale provides a quick and reliable tool to measure the usability of a system. These surveys, with their established methodologies, have been widely adopted due to their reliability and ease of implementation.
However, when it comes to e-learning systems, more specialized surveys like those proposed by Balaban et al. (2011) and Zaharias and Poylymenakou (2009) offer a more detailed approach. Balaban et al. (2011) present a holistic view of e-learning, encompassing aspects from content to personalization. Zaharias and Poylymenakou (2009), on the other hand, provide a thorough framework, touching upon aspects such as visual design, navigation, and motivation to learn. The advantage of these surveys lies in their tailored approach to e-learning. Unlike generic usability surveys, they delve deeper into the unique challenges and requirements of online learning platforms. The detailed evaluation ensures that the feedback obtained is directly relevant, facilitating more focused enhancements and an improved user experience of the e-learning system.

DEVELOPED MYTHEMATIX PROTOTYPE
This section showcases the structure of the Mythematix prototype that was developed in this study. Essentially, there are five main structures within the Mythematix prototype: the diagnostic test, adaptive navigation, learning activities, adaptive recommendations, and enhanced features.

Diagnostic Test
The diagnostic module encompasses the diagnostic test questions, the sidebar display, and the diagnostic report (Figure 1). The diagnostic test, tailored to the topic of progressions, aims to pinpoint students' strengths and weaknesses. The test used a multiple-choice question (MCQ) format, deemed efficient for assessing mathematical concepts. A one-hour duration was considered suitable for the test, which comprised 20 MCQs split between arithmetic and geometric progressions.

Adaptive Navigation
One of the adaptive features included in the Mythematix prototype is adaptive navigation, a technique used to guide learners through the learning content and activities most appropriate for their individual needs and abilities. In this study, adaptive navigation is embodied in the My Learning Path feature, which is generated based on student performance in the diagnostic test. The outcome of the My Learning Path feature developed in the Mythematix prototype is shown in Figure 2.
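To make the idea concrete, the following is a minimal sketch of how a learning path could be derived from diagnostic-test performance. The unit names, the per-unit score format, and the mastery threshold are all illustrative assumptions, not details taken from the Mythematix implementation.

```python
# Hypothetical sketch: building a personalised learning path from
# diagnostic-test results. Units where the student already shows
# mastery are skipped; the rest form the path, in syllabus order.

MASTERY_THRESHOLD = 0.8  # assumed cut-off for skipping a unit

def build_learning_path(diagnostic_scores):
    """Return the ordered list of units the student still needs.

    diagnostic_scores maps each learning unit to the fraction of
    diagnostic questions answered correctly for that unit.
    """
    path = []
    for unit, score in diagnostic_scores.items():
        if score < MASTERY_THRESHOLD:
            path.append(unit)  # below threshold: keep on the path
    return path

scores = {
    "arithmetic progression: nth term": 0.9,
    "arithmetic progression: sum": 0.5,
    "geometric progression: nth term": 0.6,
}
print(build_learning_path(scores))
# → ['arithmetic progression: sum', 'geometric progression: nth term']
```

In this sketch the path simply preserves syllabus order; an adaptive-navigation engine could additionally reorder units by prerequisite structure or predicted difficulty.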

Adaptive Recommendations
Adaptive recommendations are made based on student performance in activities such as quizzes, exercises, and assessments. The system recommends tasks for students to complete in order to master specific learning bits or learning objectives. If a student answers a question incorrectly, Mythematix recommends a list of related tasks to complete (Figure 7).
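The behaviour described above can be sketched as a simple rule-based lookup: a wrong answer triggers remedial tasks linked to that question's learning objective. The objective names and task list below are hypothetical examples, not the actual Mythematix content or code.

```python
# Hypothetical sketch of rule-based adaptive recommendation: a wrongly
# answered question maps, via its learning objective, to a list of
# remedial tasks. The mapping is illustrative only.

TASKS_BY_OBJECTIVE = {
    "sum of arithmetic progression": [
        "worked example: sum formula",
        "exercise set A",
        "video tutorial: deriving the sum formula",
    ],
}

def recommend(objective, answered_correctly):
    """Return remedial tasks for an objective the student got wrong."""
    if answered_correctly:
        return []  # nothing to remediate
    return TASKS_BY_OBJECTIVE.get(objective, [])
```

A production system would typically rank these tasks by the student's history rather than returning a fixed list, but the trigger (wrong answer, then objective-linked tasks) is the same.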

Enhanced Features
The math keyboard and the adaptability to various notational styles for mathematical expressions are the enhanced features included in the Mythematix prototype. Incorporating the math keyboard allows users to enter mathematical symbols, equations, and expressions when necessary. Whereas many other e-learning systems restrict the formats in which mathematical expressions can be input, the Mythematix prototype ensures that users can easily input mathematical expressions in ways they are familiar with, while also possessing the capability to recognize equivalent answers, accommodating various valid representations of the same mathematical solution. This feature makes the system more user-friendly, allowing students to use their preferred mathematical notation without confusion (Figure 8).
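As an illustration of equivalent-answer recognition (not the actual Mythematix code), numeric answers can be normalised to exact rational values before comparison, so that "1/2", "0.5", and "2/4" are all accepted as the same answer:

```python
from fractions import Fraction

# Illustrative sketch: mark a student's numeric answer correct if it is
# mathematically equal to the expected value, regardless of whether it
# was typed as a fraction or a decimal.

def is_equivalent(student_answer, expected):
    """Compare a fraction/decimal string against the expected answer."""
    try:
        # Fraction parses both "1/2" and "0.5" into exact rationals,
        # so 2/4, 1/2 and 0.5 all normalise to the same value.
        return Fraction(student_answer.strip()) == Fraction(expected)
    except (ValueError, ZeroDivisionError):
        return False  # unparseable input is simply marked incorrect

print(is_equivalent("0.5", "1/2"))   # → True
print(is_equivalent("2/4", "1/2"))   # → True
print(is_equivalent("0.6", "1/2"))   # → False
```

Symbolic answers (e.g. algebraic expressions) would need a computer-algebra comparison rather than rational normalisation, but the principle of checking equivalence rather than string equality is the same.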

METHODOLOGY
This section outlines the methods and procedures employed in the study to ensure a systematic and rigorous approach to the research.

Research Design
This study adopts a mixed-methods approach, combining both qualitative and quantitative research methods to provide a comprehensive understanding of the students' perceptions of the Mythematix prototype.

Instrument
A structured usability survey was administered to the students to gauge their perceptions of the Mythematix prototype's usability. The survey included Likert-scale questions and an open-ended question to capture both quantitative and qualitative data.
In the study, a usability survey was meticulously designed to delve into the practical usability, strengths, and weaknesses of the Mythematix prototype. This survey was adapted from four established usability instruments: the usability evaluation method for e-learning applications by Zaharias and Poylymenakou (2009), the e-learning online course evaluation survey by Balaban et al. (2011), the questionnaire for user interface satisfaction by Chin et al. (1998), and the software usability measurement inventory by Kirakowski and Corbett (1993). The adaptation process involved identifying and merging redundant constructs, eliminating less relevant ones, and refining items for clarity and relevance.
The final survey features 10 distinct constructs: educational content, user interface, personalisation, visual design, assessment, navigation, technical elements, accessibility, user support, and general characteristics. These constructs were represented through 45 items, measured using a 5-point Likert scale. Initially drafted in English, the survey was subsequently translated into Malay through the back-translation method and underwent an expert validation process to ensure its efficacy and relevance.

Participants
Participants in this study were 30 secondary school students aged 16 years old (form four students) at a public secondary school in Sepang, Selangor, Malaysia. These students were specifically selected to evaluate the Mythematix prototype based on two important criteria:
1. The students must be secondary school students at a public school following the Malaysian Ministry of Education's syllabus.
2. Since the Mythematix prototype's content focused on progressions, the selected students had to have learned the topic of arithmetic and geometric progressions based on the Ministry's form four school syllabus.
The participants were recruited using a systematic random sampling method, wherein researchers use a random starting point in a population and then choose members at fixed intervals. This method ensures that every student has an equal chance of being selected, and it helps in achieving a representative sample of the student population, which in this study comprised 30 students, with a gender distribution of two-thirds female (n=20, 66.7%) and one-third male (n=10, 33.3%).
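The sampling procedure described above can be sketched in a few lines: pick a random starting point within the first interval, then take every k-th member. The population size of 150 is an illustrative assumption (the paper reports only the sample of 30, not the size of the school roll).

```python
import random

# Minimal sketch of systematic random sampling: a random start within
# the first interval, then members selected at a fixed step.

def systematic_sample(population, sample_size, rng=random):
    interval = len(population) // sample_size  # fixed step k
    start = rng.randrange(interval)            # random start in [0, k)
    return [population[start + i * interval] for i in range(sample_size)]

students = list(range(1, 151))  # assumed roll of 150 students
sample = systematic_sample(students, 30)
print(len(sample))  # → 30
```

Because the start is uniformly random over the first interval, every member of the population has the same probability (sample_size / population_size) of being selected, which is the equal-chance property the text appeals to.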

Data Collection Procedure
During the initial session, the participating students were briefed about their involvement in the Mythematix intervention course (MIC). They were given an hour as an initial exposure to the Mythematix prototype. A demonstration of the application was provided, and participants were encouraged to ask questions if they encountered any issues or difficulties. Each student was required to log into the Mythematix prototype using their respective emails.
MIC was conducted in computer lab A under the supervision of a facilitator. The course schedule for MIC included sessions for diagnostic tests and self-learning on the topics of arithmetic and geometric progressions. The MIC approach was predominantly self-learning, where students engaged in activities based on the Mythematix system's recommendations and had access to video tutorials within the system for assistance. The researcher's role in MIC was primarily to monitor technical issues and maintain student discipline. The course took place over four days (Figure 9). The study began with a demonstration on using Mythematix, individual students' registration in the system, and a diagnostic test to identify their pre-existing mathematics performance, followed by individual students' utilization of Mythematix for three days, and concluded with a usability testing session. The total duration for the course was 15 hours.

Data Collection & Data Analysis Method
A structured usability survey was administered to the students using Google Forms to gauge their perceptions of the Mythematix prototype's usability. The survey included 45 Likert-scale questions and one open-ended question to capture both quantitative and qualitative data.
In this study, the usability survey was used as a diagnostic tool.The quantitative data from the usability survey was analyzed descriptively.Then, the descriptive results were used to identify the most and least useful aspects, as well as the strengths and weaknesses of the Mythematix prototype from the students' perspectives.
Following the descriptive analysis of the usability survey, qualitative data was also obtained through one open-ended question at the end of the survey. The question was designed to gather comments or suggestions from students, offering a deeper understanding of their experiences and perspectives. This qualitative data was then subjected to thematic analysis. During this process, responses were meticulously reviewed to identify recurring themes and patterns. These themes provided detailed insights into the specific areas of the Mythematix prototype that students found particularly beneficial or challenging, thereby supporting the quantitative results of the study for the future refinement and enhancement of the system.

RESULTS
The results of this study consist of two parts: the quantitative and the qualitative results.

Quantitative Data Analysis of the Usability Survey
This section presents a descriptive analysis of the data from all constructs in the usability survey. The aim was to determine the mean scores for each construct, which would highlight the strengths and weaknesses of the Mythematix prototype. The results from this section answer the first research question: "How do the students perceive the usability of the developed Mythematix prototype?" Table 1 displays the mean scores for all the constructs, and Figure 10 provides a visual representation of the distribution of these scores.
In summary, Table 1 and Figure 10 collectively provide a comprehensive overview of student perceptions regarding the usability of the Mythematix prototype, allowing for a quick comparison of how each aspect of the prototype was rated by students. Constructs with higher mean scores are perceived favourably. For instance, general characteristics and accessibility have some of the highest mean scores, suggesting positive student perceptions in these areas. Conversely, constructs like technical element and user support have lower mean scores, indicating areas where the prototype may need improvement.
The mean scores have been categorized into three levels of excellence to further facilitate data interpretation as perceived by the participants. The three levels are weak, moderate, and excellent (Table 2). The established thresholds for categorizing the mean scores into levels of excellence are justified based on the equitable distribution criterion: the scale is divided into three nearly equal segments to ensure a balanced representation of responses across the spectrum. Table 3 shows the frequency of constructs categorized as excellent, moderate, and weak. Based on Table 3, it is evident that eight constructs fall into the excellent category, indicating that many participants have a positive perception regarding the usability aspects of these constructs. Additionally, the user support and technical element constructs are categorized as moderate, while no constructs are classified as weak.

While research question 1 examines usability according to the constructs, research question 2 focuses on usability based on the five items that received the highest mean scores and the five items that received the lowest mean scores. This approach allows for a more comprehensive evaluation of the usability aspect. Analyzing the top and bottom five items is additionally helpful to pinpoint the strengths and weaknesses of the Mythematix prototype. Recognizing the top items indicates what is working well, while the bottom items highlight areas needing attention. This targeted approach ensures that refinements are made where most needed, optimizing the system's effectiveness. The findings from the analysis address the second research question: "What are the key strengths and areas of improvement for the Mythematix prototype as identified by user feedback?" The top and bottom items are presented in Table 4.
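The three-level banding described above (a 5-point scale split into three nearly equal segments) can be sketched as follows. The exact cut-offs used in Table 2 are not reproduced here, so the boundaries below are assumptions derived only from the stated "three nearly equal segments" criterion.

```python
# Sketch of the three-level categorisation of mean scores: the 1-5
# Likert range is split into three equal bands of width (5-1)/3 ≈ 1.33.
# The precise Table 2 cut-offs are assumed, not quoted from the paper.

def categorize(mean_score, low=1.0, high=5.0):
    """Map a construct's mean score to weak / moderate / excellent."""
    width = (high - low) / 3       # ≈ 1.33 per band on a 1-5 scale
    if mean_score <= low + width:          # 1.00 – 2.33
        return "weak"
    if mean_score <= low + 2 * width:      # 2.34 – 3.67
        return "moderate"
    return "excellent"                     # 3.68 – 5.00

print(categorize(2.0))  # → weak
print(categorize(3.0))  # → moderate
print(categorize(4.5))  # → excellent
```

Applying such a function to each construct's mean score reproduces the kind of frequency table summarised in Table 3.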
From Table 4, the Mythematix prototype has received varied feedback, with certain features standing out positively and others indicating areas for improvement. Starting with the strengths, the assessment feature, which offers a self-assessment or test for every chapter, has been particularly well-received. Another notable strength is the platform's general characteristics: students' willingness to recommend Mythematix to peers with similar educational needs indicates a high level of user satisfaction and trust in the platform's capabilities. In terms of personalisation, the ability for users to review their scores for both activities and assessments highlights the platform's commitment to transparency and individualized learning experiences. The results also pinpoint that the navigation in Mythematix allows users to control their learning activities, and that students love the educational content in Mythematix, which has been organized into small, digestible modules, enabling students to master each learning unit at a micro level.
However, there are areas that require attention. The user support feature, which should provide help from instructors and technical personnel through various communication channels, seems to fall short of user expectations. The technical element of the platform has also raised concerns. Users reported stability issues, which might include software bugs, computer crashes, and system freezes. This became apparent when many students disagreed that Mythematix is free from technical issues. Moreover, students concurred that they often felt disoriented or uncertain about the next steps when confronted with challenges.

Qualitative Data Analysis of Usability Survey
The qualitative data of this study was collected in the form of comments and suggestions from the users. The results from this qualitative data analysis support and further strengthen the findings from the quantitative data analysis. The qualitative findings not only validate the quantitative results but also add depth and context, ensuring a holistic interpretation of the data.
Based on the open-ended feedback, participants shared their experiences and feelings while using the Mythematix prototype. Out of the 30 students who participated, six did not provide any feedback regarding the adaptive e-learning implementation, leaving 24 students who offered a variety of comments. These comments were grouped according to related themes. Finally, the number of comments or suggestions analyzed by theme is presented in Table 5. The analysis shows that the feedback revolves around four main themes: content expansion and diversification, user experience and interface improvement, video content feedback, and learning motivation and engagement.

Content expansion & diversification
This theme received the highest frequency of feedback from the students. The students suggested that more exercises should be incorporated within Mythematix (n=4), especially higher order thinking skills (HOTS) questions (n=3). They also requested that other topics in additional mathematics be added to the system (n=3). Interestingly, the students also recommended that adaptive e-learning systems such as Mythematix be implemented for subjects other than mathematics (n=4).

Video content feedback
For the video content feedback theme, students clearly indicated that they would like more videos added to Mythematix (n=5). Apart from highlighting technical issues related to video playback (n=2), students also suggested that the Mythematix system include a feature that provides solutions in the form of video tutorials (n=2). While there were comments stating that the provided video tutorials are suitable for students who get bored easily (n=1), there were also suggestions for Mythematix to offer longer-duration videos to be clearer and more suitable for slower-paced students (n=2).

Learning motivation & engagement
Regarding the learning motivation and engagement theme, students noted the motivational impact of the platform (n=4), especially for learning additional mathematics, and found the platform to be very useful (n=2).

User experience & interface improvement
Lastly, under the user experience and interface improvement theme, there were suggestions for the Mythematix system to mark the student's last activity, making it easier for students to resume their activities each time they access the system (n=2). Additionally, there were also suggestions for the Mythematix system to provide a history feature to help students track their progress (n=1).

DISCUSSION
This research proposed the prototype of an adaptive e-learning system to facilitate mathematics learning for secondary school students.By means of adaptive learning, the prototype tailors the learning experience to individual student needs, ensuring a more personalized and effective learning journey.Drawing from both quantitative and qualitative feedback, this section delves into a comprehensive discussion on the system's usability, its strengths, areas of improvement, and the implications of these findings.

How Do Students Perceive Usability of Developed Mythematix Prototype?
In interpreting the results from the usability survey, it is evident that the Mythematix prototype has successfully met user expectations and requirements.Most of the constructs, specifically eight out of ten, received excellent scores.This indicates that in most areas, the platform excels and aligns well with the students' needs and preferences.However, two constructs gained only moderate scores, suggesting that while these areas meet basic user expectations, there is room for improvement to elevate them to the same standard as the other constructs.
Diving deeper into the quantitative data analysis of the usability survey, this study discerns thorough insights into how students perceive the Mythematix prototype's usability.The survey employed various constructs to gauge different facets of the user experience, and the mean scores derived from these constructs serve as indicators of the platform's strengths and potential areas for enhancement.
The general characteristics construct receiving the highest score is a significant indicator of the overall appeal and effectiveness of the Mythematix prototype. The high general characteristics and user interface scores in the Mythematix prototype concur with findings from Alqurni (2023), highlighting the significance of intuitive design and overall user experience in e-learning platforms.
The general characteristics construct covers a wide range of aspects, including the prototype's functionality and value to users. A high score suggests that students find the prototype engaging and would recommend it to others. Additionally, the open-ended question's findings revealed a strong desire for content expansion and diversification of subjects within the Mythematix platform. Examples of student feedback on this matter are as follows: "Maybe it could be expanded to include other chapters or sub-units ... it might also be applicable to other subjects" (student 5).
"I hope there are other subjects besides additional mathematics" (student 14).
This desire for expansion and diversification further reinforces the prototype's perceived value; students want to use it for their current learning needs and envision it as a tool to support their learning across multiple subjects.
Following the general characteristics, the accessibility construct emerged as the second highest, shedding light on another crucial aspect of the Mythematix prototype's success. The accessibility construct evaluates how easily users can access, interact with, and benefit from the platform, regardless of any potential barriers or limitations they might face. A high score in this domain signifies several key insights, including ease of access, user-friendly design, and stability.
The personalisation construct assesses the ability of the prototype to cater to individual learning needs and preferences. A positive score in this construct implies that students appreciate the tailored learning experiences offered by the prototype, allowing them to learn at their own pace and according to their preferences. This aspect of personalization is fundamental in adaptive e-learning systems, which, as supported by Chen and Huang (2023), prioritize the unique learning journey of each student by adjusting content and assessment in real time based on user feedback and interactions.
On the other hand, the educational content construct evaluates the quality, relevance, and organization of the learning material presented within the prototype. A favourable score in this domain suggests that students find the content to be well-structured, relevant, and conducive to their learning experience. The responses to the open-ended question showed that students predominantly commented on the instructional videos, more so than on quiz questions, exercises, or assessments. This highlights the instructional videos' crucial role in the Mythematix prototype, suggesting that they warrant further attention and refinement. A few constructive comments regarding the instructional videos are as follows: "The YouTube video is simple and concise, but some students may easily get bored. The video needs to be repeated several times because it cannot be paused" (student 12).
"I suggest increasing the number of tutorial videos so that we become more familiar with various types of questions" (student 17).
"I also suggest slowing down the video to make it easier to understand, as some students take longer to grasp a topic. It would help to show the calculations step by step" (student 18).
Meanwhile, the user interface construct pertains to the visual and interactive elements of the platform.A high score in this construct indicates that students find the interface intuitive, user-friendly, and aesthetically pleasing, which can significantly enhance the overall user experience.
Subsequently, the visual design construct assesses the platform's aesthetic appeal, layout, and overall visual presentation. A favourable score here implies that students find the design elements, colour schemes, graphics, and overall visual layout of Mythematix to be appealing and conducive to learning. It suggests that the visual elements enhance the platform's attractiveness and complement the educational content, making the learning experience more engaging and effective (Siahaan et al., 2022).
Next, the navigation construct evaluates the ease with which users can move through the platform, access content, and utilize features. A high score in the navigation construct indicates that the Mythematix prototype offers intuitive and user-friendly navigation, allowing students to easily access content and features. The adaptive navigation in the Mythematix prototype dynamically adjusts paths based on user interactions and learning preferences, enhancing the learning experience by providing personalized and efficient navigation suited to individual learning paths.
The assessment construct, while being the last in the excellent category, remains an important component of the Mythematix prototype's user experience. This construct evaluates the prototype's ability to provide timely, relevant, and effective evaluations of students' understanding and progress. A commendable score in the assessment domain indicates several key insights. First, the prototype offers immediate feedback post-assessment, allowing students to understand their performance, identify areas of improvement, and take corrective actions promptly. Second, the prototype employs a variety of assessment techniques, from quizzes and exercises to formal assessments, ensuring a comprehensive evaluation. However, the findings from the open-ended questions highlighted that students require more questions, especially HOTS questions. The open-ended feedback related to the questions is as follows: "More SPM-equivalent questions should be included in Mythematix" (student 21).
"Add several Higher Order Thinking Skills (HOTS) questions, similar to SPM question examples" (student 7).
"Add extensive practice for each chapter" (student 10).
The high scores in the navigation and assessment constructs align with the findings from Khan et al. (2022), which highlight AI's role in enhancing navigation and assessment in e-learning platforms, making them more responsive to student needs.
Conversely, the technical element and user support constructs had lower scores. The user support construct gauges the effectiveness of the support mechanisms in place, such as help documentation or customer support. A less favourable score in this domain suggests that students might have faced challenges seeking assistance or resolving queries while using the prototype.
Finally, the technical element assesses the stability and reliability of the platform, and a lower score in this construct points to technical glitches or issues that might have hindered the user experience.
Responses to the open-ended question brought to light multiple technical challenges associated with video playback.These challenges arose from the Mythematix prototype being hosted on a temporary server.Should the Mythematix system be expanded in the future, a dedicated server will be essential to support a high volume of concurrent users.
The moderate scores in the technical element and user support constructs reflect common challenges noted in adaptive e-learning systems. A study by Vapiwala and Pandita (2022) emphasizes the need for robust technical infrastructure and effective support mechanisms, essential for a seamless learning experience.

What Are Key Strengths & Areas of Improvement for Mythematix Prototype as Identified by User Feedback?
The Mythematix prototype, as illuminated by user feedback, showcases a blend of commendable strengths and areas ripe for refinement. Among its strengths, the platform shines in its assessment capabilities, offering timely, relevant feedback that reinforces student understanding. The Mythematix prototype's general characteristics denote a holistic, user-centric design that has garnered significant appreciation from its users. The prototype's emphasis on personalisation is evident, with tailored learning experiences allowing students to navigate their academic journey at their own pace. This is further enhanced by adaptive navigation, which intuitively guides students based on their interactions and progress. The quality and structure of the educational content have also been highlighted, indicating well-organized, relevant material conducive to effective learning. Positive feedback given by students through the open-ended questions, reflecting their satisfaction with Mythematix, includes the following: "I am able to motivate myself to continue learning additional mathematics and find it enjoyable" (student 2).
"I enjoy learning using this application. It's very useful and great" (student 6).
"Thank you for introducing us to Mythematix. I am glad to have excelled in the chapter on progressions. The video is suitable for students who quickly get bored with studying" (student 19). However, like any evolving platform, Mythematix has opportunities for enhancement and growth. One key area is content expansion, particularly in diversifying the types of instructional videos offered. Based on the feedback received, it is suggested that Mythematix should consider diversifying its video tutorial offerings to cater to different learning styles and paces. This could include creating a range of videos varying in length and depth. Short, engaging videos could be designed for students who prefer quick, concise explanations, while longer, more detailed videos could better serve those who benefit from a slower, more thorough approach to learning. This variety would ensure that Mythematix meets the diverse needs and preferences of its user base. Moreover, integrating diverse subjects and content areas could further enhance the platform's appeal, making it a comprehensive e-learning solution for students.
Additionally, addressing technical issues, especially those related to video playback, is imperative. Resolving these glitches is essential for providing a seamless, uninterrupted learning experience, ensuring that users can engage with content effectively. Finally, enhancing user support is a critical area of focus. Feedback indicates a need for more robust support mechanisms, including more comprehensive and accessible resources, to assist users in navigating challenges and maximizing their learning potential on the platform.
The Mythematix prototype represents a significant advancement in mathematics education through adaptive e-learning.Its adaptive and personalized approach caters to the varied needs of mathematics learners.The platform's focus on high-quality assessments and content highlights its commitment to enhancing mathematics education.While there are areas for improvement, such as user support and technical refinement, Mythematix is poised to become an integral tool in mathematics learning.As it continues to evolve and address these challenges, Mythematix aims to make a substantial impact on the way mathematics is taught and learned, embodying the future of adaptive e-learning in this field.

Figure 9. Research procedure of the study (Source: Authors' own elaboration)

Figure 10. Distribution of mean scores for each construct in usability survey (Source: Authors' own elaboration)

Table 1. Mean scores for all constructs

Table 2. Mythematix prototype excellence scale as perceived by participants

Table 3. Frequency of constructs nominated as excellent, moderate, & weak

Table 4. Top-five & bottom-five items

Table 5. Thematic analysis result of open-ended question