also been mentioned in [7], where it is necessary to compare one process model with other process models. Furthermore, to make such comparisons, an approach based on the process cube is used [8]. Thus, each dimension can be compared to determine whether or not there is discrimination in the assessment of lectures.

II. RELATED WORKS

Research on fairness in learning analytics is increasingly attracting the attention of researchers who are concerned about education. Studies related to fairness aim to address unfairness problems in process modeling that can lead to unfair and detrimental results. Several findings relate to fairness in educational research. Based on [9], the results indicate that teachers' conceptions of fairness were influenced by three mechanisms: (a) individual mechanisms, (b) social mechanisms, and (c) dialectical relationships between individual and social mechanisms. Each of these was a factor, from the teacher's perspective, that influenced the assessment. Moreover, the study in [10] yielded three categories: distributive justice, procedural justice, and interactional justice. That study concludes by proposing a range of implications for different testing stakeholders.

According to Mahnaz Sadat Qafari and Wil van der Aalst in their Fairness-Aware Process Mining research, the results of applying machine learning and data mining techniques are not always acceptable. In many situations, this approach tends to make an overt or unfair diagnosis, and applying it can lead to erroneous or even discriminatory conclusions. Their study presents a solution to this problem by creating a fair classifier for such situations. Unwanted effects are eliminated at the expense of reducing the accuracy of the resulting classifier [2].

Research by Qian Hu and Huzefa Rangwala highlights concerns about model bias and injustice causing unfair outcomes for some groups of students and negatively impacting their learning. The study demonstrates through concrete examples that biased educational data leads to biased student modeling, and it encourages the development of equitable formalizations and algorithms for educational applications. The proposed formalizations are individual fairness and group fairness. The study proposes a model based on the idea that predictions for students (identifying students at risk) should not be influenced by their sensitive attributes. The proposed model is shown to be effective in removing bias from this prediction and is therefore useful in helping all students [1].

Other research examines possible forms of discrimination, as well as ways to measure and define fairness, in Virtual Learning Environments (VLEs). Prediction of student course outcomes was carried out on a VLE dataset and analyzed with due regard to fairness. Two measures are recommended for investigating previous learning data, to ensure its balance and suitability for further data analysis [11].

Subsequent research proposes a systematic process for framing, detecting, documenting, and reporting the risk of unfairness. The results of this systematic approach are combined into a framework called FairEd, which helps decision makers understand the risks of unfairness along the environmental and analytical fairness dimensions. This tool makes it possible to identify data containing a risk of unfairness and to identify models in which potentially unfair outcomes can be reduced without compromising performance [12].

Furthermore, related research advocates the use of simulation as the main tool in studying algorithmic fairness. The study explores three examples of previously studied dynamic systems in the context of equitable decision making: bank loans, college admissions, and attention allocation. By analyzing how learning agents interact with these systems in simulations, it is shown that static analysis does not provide a complete picture of the long-term consequences of ML-based decision systems. The study provides an open-source software framework that can be extended to implement fairness-focused simulation studies and further reproducible research [13].

III. METHODS

In assessing the website application development course, several factors are considered when evaluating fairness in determining student graduation. These factors include student attendance, lecturer, gender, student batch, and assessment index. Online lectures mean that students and lecturers cannot meet in person, but attendance can still be calculated from activities carried out during lectures. Differences between the lecturers of the parallel classes may also have an impact on assessment bias due to differing standards: although the same semester learning plan is used, the implementation and treatment by each lecturer may differ. Furthermore, gender differences are a factor that is often used to identify discrimination in an assessment [12]. The last factor considered in this study is the student batch, since students who repeat the course or are more senior may also be one of the causes of differences in assessment standards. Assessment in the form of raw scores is not included because it is hypothesized that the assessment index is assigned in a discriminatory way without reference to the scoring rubric. Based on Fig. 1, the causal factors of discrimination in the lecture process can be arranged in a fishbone diagram and divided into machine, man, measurement, method, and milieu. Material is not included because the lecture process has no material differences and, moreover, such data are not recorded for the lecture.
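To make the factor comparison above concrete, the following sketch (not the implementation used in this study) slices a tabular export of the graduation records along each candidate dimension, in the spirit of the process cube, and reports the gap in pass rates between groups as a coarse first indicator of possible discrimination. The column names ("lecturer", "gender", "batch", "attendance", "passed") and the helper function are hypothetical placeholders for the actual event-log attributes.

import pandas as pd

def pass_rate_gaps(df: pd.DataFrame, factors=("lecturer", "gender", "batch")):
    """For each factor, return the gap between the highest and lowest
    group-level pass rate; large gaps flag dimensions worth inspecting."""
    gaps = {}
    for factor in factors:
        rates = df.groupby(factor)["passed"].mean()  # pass rate per group
        gaps[factor] = rates.max() - rates.min()
    return pd.Series(gaps).sort_values(ascending=False)

# Toy records shaped like the factors listed above (illustrative values only).
records = pd.DataFrame({
    "student":    ["s1", "s2", "s3", "s4", "s5", "s6"],
    "lecturer":   ["A",  "A",  "B",  "B",  "B",  "A"],
    "gender":     ["F",  "M",  "F",  "M",  "F",  "M"],
    "batch":      [2019, 2020, 2019, 2019, 2020, 2020],
    "attendance": [0.9,  0.7,  0.95, 0.6,  0.8,  0.85],
    "passed":     [1,    0,    1,    0,    1,    1],
})
print(pass_rate_gaps(records))

Such a screening step only ranks dimensions for further inspection; it does not by itself establish discrimination, which the fairness-aware process mining analysis described in the following sections is intended to examine.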