Математика и Информатика

https://doi.org/10.53656/math2025-6-4-obc

2025/6, стр. 554 - 577

OPTIMIZATION VS BOOSTING: COMPARISON OF STRATEGIES ON EDUCATIONAL DATASETS TO EXPLORE LOW-PERFORMING AT-RISK AND DROPOUT STUDENTS

Ranjit Paul
OrcID: 0009-0007-3046-7532
E-mail: rs_ranjitpaul@dibru.ac.in
Centre for Computer Science and Applications
Dibrugarh University Dibrugarh 786004 India
Asmaa Mohamed
OrcID: 0000-0003-4314-463X
E-mail: asmaamohamed89@azhar.edu.eg
Department of Mathematics Faculty of Sciences
Al-Azhar University (girls branch) Egypt
Peren Jerfi Canatalay
OrcID: 0000-0002-0702-2179
E-mail: peren.canatalay@istinye.edu.tr
Department of Computer Engineering Faculty of Engineering and Natural Sciences
Istinye University 34396
Istanbul Turkey
Ashima Kukkar
OrcID: 0000-0001-7664-1005
E-mail: ashima@chitkara.edu.in
Chitkara University Institute of Engineering and Technology
Chitkara University Punjab India
Sadiq Hussain
OrcID: 0000-0002-9840-4796
E-mail: sadiq@dibru.ac.in
Centre for Computer Science and Applications
Dibrugarh University Dibrugarh 786004 India
Arun K. Baruah
OrcID: 0009-0003-2115-9977
E-mail: arun@dibru.ac.in
Department of Mathematics
Dibrugarh University Dibrugarh Assam
Jiten Hazarika
OrcID: 0009-0001-3464-1071
E-mail: jiten_stats@dibru.ac.in
Department of Statistics
Dibrugarh University Dibrugarh Assam
Silvia Gaftandzhieva
OrcID: 0000-0002-0569-9776
E-mail: sissiy88@uni-plovdiv.bg
University of Plovdiv Paisii Hilendarski Plovdiv (Bulgaria)
Esraa A. Mahareek
OrcID: 0000-0002-9042-248X
E-mail: esraa.mahareek@azhar.edu.eg
Department of Mathematics Faculty of Sciences
Al-Azhar University (girls branch) Egypt
Abeer S. Desuky
OrcID: 0000-0003-1661-9134
E-mail: abeerdesuky@azhar.edu.eg
Department of Mathematics Faculty of Sciences
Al-Azhar University (girls branch)
Rositsa Doneva
OrcID: 0000-0003-0296-1297
E-mail: rosi@uni-plovdiv.bg
University of Plovdiv Paisii Hilendarski Plovdiv (Bulgaria)

Резюме: The paper proposes a comprehensive student academic performance prediction approach by integrating machine learning with metaheuristic optimization. Initial models (Logistic Regression, Decision Tree, Random Forest, MLP) were refined using boosting techniques (Gradient Boosting, XGBoost, LightGBM), with XGBoost achieving 95.59% accuracy. Eight modern optimization algorithms were applied for feature selection to enhance model efficiency and interpretability, with the Grey Wolf Optimizer and the Heap-Based Optimizer outperforming others in key metrics. Support Vector Machine classifiers applied after feature selection strengthened the predictive capability of the selected feature subsets. The research outcomes demonstrate that uniting boosting approaches with feature selection algorithms enables the creation of reliable and scalable predictive models that detect student success and failure earlier.

Ключови думи: Machine Learning; Optimization Algorithms; Educational Data Mining; Ensemble Models; Boosting Algorithms.

Ranjit Paul1), Asmaa Mohamed2), Peren Canatalay3), Ashima Kukkar4), Sadiq Hussain1), Arun Baruah1), Jiten Hazarika1), Silvia Gaftandzhieva5), Esraa Mahareek2), Abeer Desuky 2), Rositsa Doneva5)

1)Dibrugarh University, Dibrugarh (India)

2)Al-Azhar University (girls branch), Cairo (Egypt)

3)Istinye University, Istanbul (Turkey)

4) Chitkara University, Punjab (India)

5) University of Plovdiv “Paisii Hilendarski”, Plovdiv (Bulgaria)

Abstract. The paper proposes a comprehensive student academic performance prediction approach by integrating machine learning with metaheuristic optimization. Initial models (Logistic Regression, Decision Tree, Random Forest, MLP) were refined using boosting techniques (Gradient Boosting, XGBoost, LightGBM), with XGBoost achieving 95.59% accuracy. Eight modern optimization algorithms were applied for feature selection to enhance model efficiency and interpretability, with the Grey Wolf Optimizer and the Heap-Based Optimizer outperforming others in key metrics. Support Vector Machine classifiers applied after feature selection strengthened the predictive capability of the selected feature subsets. The research outcomes demonstrate that uniting boosting approaches with feature selection algorithms enables the creation of reliable and scalable predictive models that detect student success and failure earlier.

Keywords: Machine Learning; Optimization Algorithms; Educational Data Mining; Ensemble Models; Boosting Algorithms.

1. Introduction

Inconsistent dropout definitions and varied calculation methods (Xu & Kim, 2024) lead to reporting discrepancies, complicating efforts to implement effective student retention strategies. Higher education institutions (HEIs) use several monitoring techniques to assess student performance by tracking course advancement and analyzing academic standing each semester (Chen et al., 2014).

Technological advancements and increased data availability have established Educational Data Mining (EDM) as a specialized research field (Apriyadi & Rini, 2023). EDM uses data mining techniques to find actionable patterns in educational data. Its predictive models analyze student performance to help HEIs address dropout risks. However, standard predictive techniques still face challenges related to interpretability, scalability, and computational efficiency (Shekhar et al., 2020).

Data preprocessing, specifically feature selection, is crucial for optimizing data mining systems by removing redundant and noisy data. This process improves algorithm performance and enables classifiers to achieve higher accuracy. The two main types of feature selection are: filter methods, which are computationally efficient but cannot detect feature dependencies, and wrapper methods, like Linear Discriminant Analysis (LDA) and K-Nearest Neighbor (KNN), which are more effective at identifying complex dependencies but are computationally intensive and thus limited to smaller datasets. Identifying the optimal feature subset remains a challenge, as efficient search mechanisms (complete, random, or heuristic) risk overlooking optimal solutions (Hussain et al., 2020; Farissi et al., 2022; Punitha & Devaki, 2024; Ajibade et al., 2019).

Heuristic search mechanisms offer an effective and efficient framework for problem-solving. Specifically, metaheuristic algorithms like Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO) demonstrate exceptional capability in feature selection. By replicating natural processes and employing probabilistic rules, these algorithms efficiently navigate large parameter spaces and escape local optima, making them well-suited for complex, high-dimensional datasets. This enables improved feature selection quality and more effective predictive models, particularly in applications like student performance prediction (Kukkar et al., 2023; Kukkar et al., 2024).
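As a concrete illustration of the filter/wrapper distinction, the sketch below (scikit-learn on synthetic data; the dataset, `k`, and classifier choices are illustrative assumptions, not the study's setup) scores features independently with mutual information versus searching subsets by cross-validating a KNN classifier:

```python
# Contrast of filter vs wrapper feature selection on synthetic data.
from sklearn.datasets import make_classification
from sklearn.feature_selection import (SelectKBest, mutual_info_classif,
                                       SequentialFeatureSelector)
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=300, n_features=20, n_informative=5,
                           random_state=42)

# Filter method: score each feature independently (fast, ignores interactions).
filt = SelectKBest(mutual_info_classif, k=5).fit(X, y)
print("Filter-selected features:", sorted(filt.get_support(indices=True)))

# Wrapper method: greedily grow a subset by cross-validating a KNN classifier
# (captures dependencies, but costs one model fit per candidate subset).
knn = KNeighborsClassifier(n_neighbors=5)
wrap = SequentialFeatureSelector(knn, n_features_to_select=5, cv=3).fit(X, y)
print("Wrapper-selected features:", sorted(wrap.get_support(indices=True)))
```

The wrapper run fits the classifier dozens of times, which is exactly the cost that limits wrapper methods to smaller datasets and motivates the metaheuristic searches discussed above.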

Using data to boost student retention is a key goal for university administrators. However, the sheer volume of student data can be overwhelming, requiring advanced tools to identify and help at-risk students proactively.

This study presents an integrated solution combining Machine Learning (ML) and metaheuristic methods to predict student academic performance. It equips teachers, administrators, and policymakers with a predictive tool for tracking at-risk students and implementing effective interventions. Such a tool transforms large-scale student data into actionable intelligence, enabling the strategic allocation of support services and the timely implementation of targeted interventions designed to improve student outcomes and reduce attrition. The proposed EDM approach addresses current deficiencies, providing a comprehensive framework for fostering academic success and reducing dropout rates in HEIs. To achieve this, the research focused on:
– developing an accurate predictive analytics model using historical academic and demographic data;
– applying advanced data preprocessing techniques for improved data quality;
– implementing a range of advanced ML algorithms (Logistic Regression (LR), Decision Tree (DT), Random Forest (RF), XGBoost, LightGBM) and metaheuristic techniques (Mud Ring Algorithm (MRA), Archimedes Optimization (AO), Jellyfish Search (JS), Ant Lion Optimizer (ALO), Grey Wolf Optimizer (GWO), Whale Optimization Algorithm (WOA), Heap-Based Optimizer (HBO), Equilibrium Optimizer (EO)) to enhance prediction accuracy;
– conducting feature selection and importance analysis using RF, identifying 12th-grade percentage, CGPA, and gender as key predictors;
– performing a comparative analysis of boosting versus optimization techniques for feature selection to improve classifier efficiency and predictive accuracy;
– utilizing data visualization (histograms, heatmaps) to analyze patterns and relationships;
– statistically validating findings through cross-validation and comparisons with state-of-the-art methods;
– ensuring the model is computationally efficient and scalable for diverse educational datasets;
– providing data-driven insights for targeted interventions to improve learning outcomes.

The paper's structure includes Related Work (Section 2), Proposed Methodology (Section 3), Experimental Results (Section 4), Discussion (Section 5), and Conclusion (Section 6).

2. Related work

Ma (2024) enhanced student performance prediction by optimizing an RF Classifier with Electric Charged Particles Optimization (ECPO) and Artificial Rabbits Optimization. Analyzing 4,424 student records, their optimized model demonstrated higher predictive precision and better alignment with actual values, proving bio-inspired algorithms effective for educational decision-making.

Thaher et al. (2021) developed a Student Performance Predictive model using an enhanced WOA (EWOA) for automatic feature selection. Their approach integrated the Sine Cosine Algorithm, a Logistic Chaotic Map, and Adaptive Synthetic Sampling to address data imbalances. This method, particularly with LDA, showed superior reliability and enhanced prediction accuracy compared to other classifiers and feature selection methods on real educational datasets.

Hasheminejad & Sarvmili (2019) introduced S3PSO, a discrete PSO method for forecasting student outcomes via rule-based prediction. Using Support, Confidence, and Comprehensibility metrics, S3PSO generated understandable rules from the Moodle dataset, achieving a 31% fitness improvement over standard methods like CART, C4.5, and ID3. It also outperformed benchmark algorithms (Support Vector Machine (SVM), KNN, Naive Bayes (NB), Neural Networks (NN), APSO) by 9% in student performance forecasting accuracy.

Turabieh et al. (2021) developed HHO-based dynamic controllers with KNN clustering to overcome early stagnation and local minima in student performance feature selection. Their HHO-enhanced model, particularly with Layered Recurrent NN and Artificial NN (ANN), achieved the highest accuracy on UCI data for early prediction of student outcomes.

Song (2024) integrated KNN with Honey Badger Optimization (HBO) and the Arithmetic Optimization Algorithm (AOA) to create the KNHB prediction system. This model excelled in both prediction and classification tasks for the G1 and G3 datasets, demonstrating high accuracy (0.921) and precision (0.92) for G3. The KNHB model also demonstrated exceptional precision as a G1 value forecaster, with accuracy and precision scores of 0.899 and 0.90, respectively, in the prediction phase.

Ren & He (2024) enhanced a NB model for student performance prediction using Leader Harris Hawks Optimization and the Alibaba and the Forty Thieves Algorithm. Their model achieved 0.891 accuracy and substantial precision, recall, and F1-scores, outperforming other tested models; the resulting gain in prediction precision helps educational institutions deliver better student support and improve academic results.

Hai & Wang (2024) improved Multilayer Perceptron Classification (MLPC) for student performance prediction by combining the Pelican Optimization Algorithm and the Crystal Structure Algorithm. Their MLPO2 approach, using appropriate fine-tuning and preprocessing, achieved a 95.78% success rate and effectively handled class imbalance and high dimensionality.

Li & He (2024) applied ML dimensionality reduction and optimized an Extra-Trees Classifier with the Gorilla Troops Optimizer and the Reptile Search Algorithm for student success prediction. Their ETRS model achieved 97.5% accuracy in G1 mathematics course prediction, demonstrating the promise of bio-inspired optimization for educational outcomes.

Goran et al. (2024) used metaheuristic optimization with a modified Sinh Cosh Optimizer to enhance Adaptive Boosting (AdaBoost) and XGBoost for student dropout risk prediction. Their approach demonstrated superior performance on real-world binary and multi-class datasets, with SHAP and SAGE explainability methods identifying key dropout triggers for targeted retention programs.

Cheng et al. (2024) evaluated various ML techniques (RF, DT, KNN, MLP, XGBoost) and ANNs for student performance prediction. Their SVM–SMOTE data-balancing process significantly improved results, with an Enhanced Artificial Ecosystem-Based Optimization + XGBoost hybrid model achieving 0.9417 accuracy and a 0.9413 F1-score, confirming the success of combining ML with metaheuristics for precise student performance prediction.

Figure 1. Percentage Distribution of the Identified Research Gaps

Previous studies often neglected student demographic and socioeconomic factors, used limited datasets, and inadequately addressed missing values, outliers, and feature engineering (see Fig. 1). In contrast, the current study utilizes a diverse dataset encompassing enrollment, 10th/12th-grade scores, demographics (gender, caste), and specialized program information. To ensure strong model foundations, we establish a robust data preprocessing pipeline, including categorical encoding, missing-value imputation, and IQR-based outlier detection. While prior research frequently employed NB, DT, RF, and some bio-inspired algorithms for optimization, our investigation specifically examines the predictive excellence of AdaBoost and Gradient Boosting. Furthermore, unlike studies using metaheuristic algorithms (e.g., ECPO, WOA, ACO) for feature selection without explaining feature importance, this work utilizes Random Forest for feature importance analysis, identifying 12th-grade percentage, CGPA, and gender as the most influential predictors. The issue of imbalanced class distribution, often inadequately addressed in related works and leading to biased outcomes, is resolved in our method through SVM–SMOTE, improving recall and F1-scores. Recognizing the computational intensity of some metaheuristic optimization techniques (e.g., Jellyfish Search, HBO) that limits their application to large datasets, our research prioritizes computational efficiency and scalability through performance evaluation and training time measurements. Finally, while current models struggle to identify key performance factors and lack direct comparative analysis between popular ML and metaheuristic optimization techniques, our study performs an extensive comparative analysis. In essence, the research fills these gaps by integrating extensive datasets, advanced preprocessing, optimized ML models, metaheuristic-based feature selection and classification with class balancing, and efficiency checks.
These enhancements yield predictive models with improved accuracy, interpretability, and scalability for student academic performance assessment.

3. Proposed methodology

This study develops a robust predictive model for student academic performance by integrating diverse historical academic and demographic data obtained from educational institutions. The process involves dataset description, data preprocessing, experimental setup, model development, performance evaluation, and comparative analysis.

3.1. Dataset description

The dataset, provided by multiple educational institutions, comprises 19 numerical and 17 categorical features. It includes academic performance metrics like 10th and 12th standard examination scores and CGPA, alongside demographic details (gender, caste), program-specific data (major and minor subjects), and institutional identifiers (enrollment number, college name). Table 1 provides a detailed description of the attributes.

Table 1. Feature descriptions

ENROLLMENT – Unique enrollment number of each student.
Programme – The program the student is enrolled in, such as B.A.
College Name – The name of the college.
MAJOR – The major subject chosen by the student, such as Education.
MINOR – The minor subject chosen by the student, such as Sociology or Political Science.
GENDER – Gender of the student: MALE or FEMALE.
AGE – Age of the student, expressed in years, months, and days.
CASTE – The caste of the student: UR (Unreserved), ST (Scheduled Tribes), SC (Scheduled Castes), OBC (Other Backward Class).
X PASSING YEAR – The year the student passed their 10th standard examination.
X PERCENTAGE – Percentage scored in the 10th standard examination.
XII PASSING YEAR – The year the student passed their 12th standard examination.
XII STREAM – The stream chosen by the student in the 12th standard examination.
XII MAXIMUM MARKS – The maximum possible marks in the 12th standard examination.
XII MARKS OBTAINED – Marks obtained by the student in the 12th standard examination.
XII PERCENTAGE – The percentage scored in the 12th standard examination.
XII SUB 1, MAX MARK 1, OBTAINED MARK 1 – Subject, maximum marks, and obtained marks for the first subject in the 12th standard.
XII SUB 2, MAX MARK 2, OBTAINED MARK 2 – Subject, maximum marks, and obtained marks for the second subject in the 12th standard.
XII SUB 3, MAX MARK 3, OBTAINED MARK 3 – Subject, maximum marks, and obtained marks for the third subject in the 12th standard.
XII SUB 4, MAX MARK 4, OBTAINED MARK 4 – Subject, maximum marks, and obtained marks for the fourth subject in the 12th standard.
XII SUB 5, MAX MARK 5, OBTAINED MARK 5 – Subject, maximum marks, and obtained marks for the fifth subject in the 12th standard.
XII SUB 6, MAX MARK 6, OBTAINED MARK 6 – Subject, maximum marks, and obtained marks for the sixth subject in the 12th standard.
CGPA – Cumulative Grade Point Average of the student.
STATUS – Status of the student, which can be 1, 2, or 3: 1 means dropout, 2 means at-risk, and 3 means passed.

3.2. Data Preprocessing

The accuracy of predictions depends heavily on maintaining data quality. The preprocessing steps include:

– Handling Missing Values: imputation was used to handle missing values while preserving the complete dataset structure.
– Categorical Encoding: variables such as gender and caste were transformed using One-Hot Encoding to convert them into a numerical format suitable for machine learning algorithms.
– Outlier Detection and Adjustment: outliers were identified using the Interquartile Range (IQR) method. Data points with zeros or unusually high values (e.g., in "XII MAXIMUM MARKS" or CGPA) were carefully reviewed and adjusted to mitigate the effects of data entry errors.
– Feature Engineering: new attributes (e.g., average subject marks) were generated to help the model capture a more holistic view of a student's academic performance beyond individual grades.
– Standardization: in addition to creating new attributes, existing numerical features were standardized (e.g., with StandardScaler) to ensure they are on a comparable scale for the machine learning algorithms.

Fig. 2 depicts a feature importance plot.

Figure 2. Feature Importance Plot
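The preprocessing steps above can be sketched as follows; the tiny frame and its column names are illustrative stand-ins (drawn from the feature names in Table 1), not the study's actual data:

```python
# Hedged sketch of the preprocessing pipeline: imputation, one-hot
# encoding, IQR-based outlier capping, and standardization.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler

df = pd.DataFrame({
    "GENDER": ["MALE", "FEMALE", None, "FEMALE"],
    "XII PERCENTAGE": [82.4, 67.0, 91.2, np.nan],
    "CGPA": [8.1, 6.4, 9.0, 7.2],
})

# 1. Missing values: mode for categoricals, median for numerics.
df["GENDER"] = df["GENDER"].fillna(df["GENDER"].mode()[0])
df["XII PERCENTAGE"] = df["XII PERCENTAGE"].fillna(df["XII PERCENTAGE"].median())

# 2. Categorical encoding: one-hot.
df = pd.get_dummies(df, columns=["GENDER"])

# 3. Outliers: cap numeric values outside 1.5 * IQR.
num_cols = ["XII PERCENTAGE", "CGPA"]
for col in num_cols:
    q1, q3 = df[col].quantile([0.25, 0.75])
    iqr = q3 - q1
    df[col] = df[col].clip(q1 - 1.5 * iqr, q3 + 1.5 * iqr)

# 4. Standardization: zero mean, unit variance.
df[num_cols] = StandardScaler().fit_transform(df[num_cols])
print(df.round(2))
```

In a real pipeline these transforms would be fitted on the training split only and then applied to the validation and test splits.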

3.3. Data Visualisation and Reporting

To support interpretability and assess model performance, various visualization techniques were utilized.

A Confusion Matrix illustrates true versus false classifications for the best-performing models. In a multi-class problem like this one (with classes: Dropout, At-Risk, Passed), the matrix is an n × n table, where n is the number of classes. Typically, each row represents the actual class, while each column represents the predicted class. The cells along the main diagonal show the number of correct predictions, where the predicted class matches the actual class. The cells off the diagonal show the errors or misclassifications.

To calculate performance metrics for a multi-class model, each class is typically evaluated in a "one-vs-all" manner. For any given class, we can define four basic outcomes: True Positive (TP), True Negative (TN), False Positive (FP), and False Negative (FN). These four outcomes are used to calculate several key metrics that measure a model's performance from different perspectives:
– Accuracy: the proportion of all predictions the model got right, calculated as the sum of all correct predictions (the diagonal) divided by the total number of predictions.
– Precision: the accuracy of the positive predictions, Precision = TP / (TP + FP).
– Recall: the model's ability to find all relevant instances of a class, Recall = TP / (TP + FN).
– F1-Score: a single score that balances precision and recall, F1 = 2 × (Precision × Recall) / (Precision + Recall).
– Geometric Mean: useful for imbalanced datasets because it measures the balance between classification performance on the majority and minority classes, calculated as the root of the product of the per-class sensitivities (recalls).
– Matthews Correlation Coefficient (MCC): considered a very reliable evaluation metric because it produces a high score only if the prediction did well in all four categories (TP, TN, FP, FN); its value ranges from −1 to +1, where +1 indicates a perfect prediction, 0 a random prediction, and −1 total disagreement between prediction and observation.
– Log Loss: measures the difference between the predicted probabilities and the actual outcomes, penalizing the model more heavily for being confident in an incorrect prediction; a lower value is better, with a perfect model having a log loss of 0.

ROC Curves evaluate the trade-offs between sensitivity and specificity.

Training and Validation Graphs allow monitoring model convergence over epochs or iterations.

Comparison Tables visually represent the impact of each model compared to other techniques.
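For illustration, the per-class metrics described above can be computed from a small hypothetical 3-class prediction set with scikit-learn (the labels below are invented for the example, not the study's results):

```python
# Metrics from a 3-class confusion matrix (classes 1=Dropout, 2=At-Risk,
# 3=Passed), matching the definitions in the text.
import numpy as np
from sklearn.metrics import (confusion_matrix,
                             precision_recall_fscore_support,
                             matthews_corrcoef)

y_true = np.array([1, 1, 2, 2, 2, 3, 3, 3, 3, 1])
y_pred = np.array([1, 2, 2, 2, 3, 3, 3, 3, 1, 1])

cm = confusion_matrix(y_true, y_pred)
accuracy = np.trace(cm) / cm.sum()          # diagonal / total predictions
prec, rec, f1, _ = precision_recall_fscore_support(y_true, y_pred,
                                                   average="macro")
# Geometric mean: n-th root of the product of per-class recalls.
gmean = np.prod(np.diag(cm) / cm.sum(axis=1)) ** (1 / cm.shape[0])
mcc = matthews_corrcoef(y_true, y_pred)     # ranges from -1 to +1

print(f"accuracy={accuracy:.2f} macro-F1={f1:.2f} "
      f"G-mean={gmean:.2f} MCC={mcc:.2f}")
```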

4. Experimental setup and model development

To ensure a robust evaluation of the predictive models, the experimental process was divided into three main phases.

4.1. Experiment 1: Evaluation of baseline Machine Learning algorithms

We implemented LR, DT, RF, and Multi-Layer Perceptron as baseline models for comparative analysis and robust results. Ten-fold cross-validation ensured reliability, while RF-based feature importance analysis identified 12th-grade percentage, CGPA, and gender as key predictors of student performance. These predictors serve as key indicators that can help institutions identify students who may be at risk of falling behind. Detailed results are presented in Figs. 3–6.

Accuracy: 60.43%
Classification report (class | precision | recall | F1-score | support):
1.0 | 0.59 | 0.01 | 0.02 | 1883
2.0 | 0.57 | 0.81 | 0.67 | 3501
3.0 | 0.65 | 0.73 | 0.69 | 3252
accuracy: 0.60 (8636)
macro avg | 0.61 | 0.52 | 0.46 | 8636
weighted avg | 0.61 | 0.60 | 0.53 | 8636

Figure 3. Evaluation metrics and confusion matrix related to logistic regression

Accuracy: 88.83%
Classification report (class | precision | recall | F1-score | support):
1.0 | 0.73 | 0.77 | 0.75 | 1883
2.0 | 0.94 | 0.93 | 0.93 | 3501
3.0 | 0.93 | 0.92 | 0.92 | 3252
accuracy: 0.89 (8636)
macro avg | 0.87 | 0.87 | 0.87 | 8636
weighted avg | 0.89 | 0.89 | 0.89 | 8636

Figure 4. Evaluation metrics and confusion matrix related to the decision tree

One of the noteworthy outcomes of this study was the identification of influential predictors of student performance through feature importance analysis, particularly using the Random Forest (RF) algorithm, which revealed 12th-grade percentage, CGPA, and gender as key predictors, highlighting their significance for predictive models. We evaluated LR, DT, the RF Classifier, and the Multi-Layer Perceptron: while LR and DT offered interpretability, RF and the Multi-Layer Perceptron achieved higher predictive accuracy. This study underscores the importance of selecting optimal algorithms and feature combinations for accurate student performance evaluation in educational predictive modeling.

Accuracy: 91.15%
Classification report (class | precision | recall | F1-score | support):
1.0 | 0.88 | 0.68 | 0.77 | 1883
2.0 | 0.94 | 1.00 | 0.97 | 3501
3.0 | 0.89 | 0.95 | 0.92 | 3252
accuracy: 0.91 (8636)
macro avg | 0.91 | 0.88 | 0.89 | 8636
weighted avg | 0.91 | 0.91 | 0.91 | 8636

Figure 5. Evaluation metrics and confusion matrix related to Random Forest

Accuracy: 91.67%
Classification report (class | precision | recall | F1-score | support):
1.0 | 0.93 | 0.67 | 0.78 | 1883
2.0 | 0.94 | 1.00 | 0.97 | 3501
3.0 | 0.89 | 0.97 | 0.93 | 3252
accuracy: 0.92 (8636)
macro avg | 0.92 | 0.88 | 0.89 | 8636
weighted avg | 0.92 | 0.92 | 0.91 | 8636

Figure 6. Evaluation metrics and confusion matrix related to MLP
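The baseline setup of Experiment 1 (ten-fold cross-validation over the four baseline models, plus RF feature importance ranking) can be sketched as follows; synthetic data and default hyperparameters stand in for the study's non-public dataset:

```python
# Ten-fold CV over the four baseline models and RF feature importances.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=10, n_classes=3,
                           n_informative=5, random_state=0)

models = {
    "LR": LogisticRegression(max_iter=1000),
    "DT": DecisionTreeClassifier(random_state=0),
    "RF": RandomForestClassifier(random_state=0),
    "MLP": MLPClassifier(max_iter=500, random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=10)   # ten-fold CV
    print(f"{name}: mean accuracy {scores.mean():.3f}")

# RF feature importance analysis (ranks predictors, as in Fig. 2).
rf = RandomForestClassifier(random_state=0).fit(X, y)
ranked = sorted(enumerate(rf.feature_importances_), key=lambda t: -t[1])
print("top features:", [i for i, _ in ranked[:3]])
```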

4.2. Experiment 2: Evaluation of Boosting Algorithms

To establish optimal baseline performance, we assessed three boosting algorithms in the first phase: Gradient Boosting Classifier (GBC), XGBoost, and LightGBM. These models collectively demonstrated high performance with an average Accuracy of 92.86%, Precision of 93.31%, Recall of 92.99%, and F1 Score of 92.55%.

Accuracy: 92.86%
Classification report (class | precision | recall | F1-score | support):
1.0 | 0.96 | 0.70 | 0.81 | 1883
2.0 | 0.94 | 1.00 | 0.97 | 3501
3.0 | 0.90 | 0.99 | 0.94 | 3252
accuracy: 0.93 (8636)
macro avg | 0.94 | 0.89 | 0.91 | 8636
weighted avg | 0.93 | 0.93 | 0.92 | 8636

Figure 7. Evaluation metrics and confusion matrix related to GBC

Accuracy: 95.59%
Classification report (class | precision | recall | F1-score | support):
1.0 | 0.99 | 0.80 | 0.89 | 1883
2.0 | 0.94 | 1.00 | 0.97 | 3501
3.0 | 0.96 | 1.00 | 0.98 | 3252
accuracy: 0.96 (8636)
macro avg | 0.96 | 0.93 | 0.94 | 8636
weighted avg | 0.96 | 0.96 | 0.95 | 8636

Figure 8. Evaluation metrics and confusion matrix related to XGBoost

Figure 9. XGBoost Log Loss

Accuracy: 93.86%
Classification report (class | precision | recall | F1-score | support):
1.0 | 0.97 | 0.74 | 0.84 | 1883
2.0 | 0.94 | 1.00 | 0.97 | 3501
3.0 | 0.93 | 0.99 | 0.96 | 3252
accuracy: 0.94 (8636)
macro avg | 0.94 | 0.91 | 0.92 | 8636
weighted avg | 0.94 | 0.94 | 0.94 | 8636

Figure 10. Evaluation metrics and confusion matrix related to LightGBM

Figure 11. Learning Curve and ROC Curve for LightGBM

Figure 12. Confusion Matrix for LightGBM

Table 2. Comparison of outcome of different boosting algorithms based on their evaluation metrics

Model | Accuracy % | Precision % | Recall % | F1 Score %
Gradient Boosting | 92.99 | 93.31 | 92.99 | 92.55
XGBoost | 95.59 | 96 | 96 | 95
Tuned LightGBM | 94.19 | 94.45 | 94.20 | 93.90

The XGBoost model (see Table 2), with 96% precision and recall, effectively identifies student statuses (Dropout, At-Risk, Passed). Its high precision means that when the model predicts a status, it is correct 96% of the time, leading to a low false positive rate. The 96% recall signifies it finds 96% of all students in each category, resulting in a low false negative rate. An F1-score of 95% indicates a strong, reliable balance between these two metrics, making the model a highly effective tool for identifying student statuses.

Experiment 2 analyzed the baseline performance of the Gradient Boosting Classifier (GBC), XGBoost, and LightGBM in predictive modeling. Performance was assessed using accuracy, precision, recall, and F1-score (Figs. 7, 8, 10), with XGBoost log loss shown in Fig. 9, and LightGBM's learning curve, ROC curve, and confusion matrix in Figs. 11-12. Table 2 provides a detailed comparison. GBC demonstrated exceptional results, achieving 92.99% accuracy, a 0.9255 F1-score, 93.31% precision, and 92.99% recall. Its strong Kappa (0.8898) and MCC (0.8940) values confirmed reliability and competence with imbalanced data, with a training time of 48.4370 seconds. XGBoost and the tuned LightGBM also performed impressively: XGBoost reached 95.59% accuracy with a 0.95 F1-score, while the tuned LightGBM achieved 94.19% accuracy and a 0.9390 F1-score. This demonstrates the effectiveness of GBC, XGBoost, and LightGBM in predictive modeling, each exhibiting distinct strengths across various metrics. Experiment 2's results underscore that optimal boosting algorithm selection depends on specific task requirements and performance objectives in classification tasks.

4.3. Experiment 3: Optimization-Based Feature Selection Followed by Classification

The second stage involved optimizing model performance by selecting the best possible features. Eight modern optimization techniques identified the most important features from the training data: MRA, AOA, JS, ALO, GWO, WOA, HBO, and EO. The feature selection process includes: Binary Encoding (each candidate solution is represented as a binary array where "1" indicates the feature is selected and "0" indicates exclusion), Fitness Evaluation (the F-measure of each candidate is computed to assess the quality of the selected feature subset), and Data Splitting (the dataset is segmented into training, validation, and testing sets). The training set produces candidate solutions, the validation set determines convergence, and the testing set provides the final evaluation.
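The binary-encoding and fitness-evaluation steps above can be sketched as follows. This toy loop uses single-bit flips as a simplified stand-in for the eight metaheuristics' update rules, with the F-measure of an RBF-kernel SVM as fitness; the data and iteration budget are illustrative assumptions:

```python
# Toy binary wrapper selection: candidates are 0/1 masks, fitness is the
# F-measure of an RBF-kernel SVM on a validation split.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=400, n_features=15, n_informative=5,
                           random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

def fitness(mask):
    """F-measure of an RBF SVM trained on the selected feature subset."""
    if mask.sum() == 0:
        return 0.0
    clf = SVC(kernel="rbf").fit(X_tr[:, mask == 1], y_tr)
    return f1_score(y_val, clf.predict(X_val[:, mask == 1]))

best = rng.integers(0, 2, X.shape[1])        # random binary encoding
best_fit = fitness(best)
for _ in range(30):                          # search loop: flip-one-bit moves
    cand = best.copy()
    cand[rng.integers(X.shape[1])] ^= 1      # toggle one feature in/out
    f = fitness(cand)
    if f > best_fit:
        best, best_fit = cand, f

print(f"selected {int(best.sum())}/{X.shape[1]} features, "
      f"F-measure {best_fit:.3f}")
```

A real metaheuristic (GWO, HBO, etc.) would maintain a population of such masks and update them via its own position-update equations, but the encoding, fitness function, and train/validation protocol are exactly the ones described in the text.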

After optimal feature subset identification, SVM with an RBF kernel was used for classification, ensuring consistent performance comparison before and after feature selection. Among the optimization methods evaluated (metrics: accuracy, F-measure, geometric mean, sensitivity, specificity, precision), GWO and HBO showed superior performance, with GWO achieving 94.5% accuracy and strong F-measure and geometric mean scores.

5. Comparative analysis and discussion

Tables 3 and 4 present the per formance of two distinct strategies for student performance prediction: Boosting-Based Models (Experiment 2) and Modern Optimization Methods (Experiment 3).

Boosting-Based Models (Experiment 2), particularly the GBC, excel in overall predictive accuracy and have been extensively validated using cross-validation. However, their computational efficiency varies, with AdaBoost offering a faster training time at a marginal cost to accuracy.

Experiment 3 (Modern optimization methods) focused on using optimization methods for feature selection, aiming to identify the most suitable feature subset from student datasets to enhance prediction capabilities. This process involved:
– comparing the proposed method against contemporary optimization algorithms (MRA, AOA, JS, ALO, GWO, WOA, HBO, EO);
– implementing binary encoding for feature selection, where "ones" denote included features and "zeros" represent excluded ones;
– evaluating individual fitness within optimization methods using the F-measure;
– segmenting the dataset into training, validation, and testing subsets, with optimization methods searching for the best feature subset on the training set and validating on the validation set until termination criteria are met;
– classifying the identified optimal subset using an SVM with an RBF kernel to ensure fair comparison of performance gains post-feature selection.

Table 3. Comparison of the outcome of different optimization algorithms

Method | Accuracy | F-Measure | G-mean | Features
Or (all features) | 74.6 | 70.73 | 75.79 | 360
MRA | 91.25 | 91.32 | 91.25 | 177
AOA | 60.5 | 62.02 | 60.37 | 255
JS | 62 | 65.77 | 61.02 | 275
ALO | 93.5 | 93.33 | 93.47 | 181
GWO | 94.5 | 94.27 | 94.42 | 159
WOA | 89.25 | 89.49 | 89.22 | 266
HBO | 94 | 93.78 | 93.93 | 186
EO | 91.25 | 91.23 | 91.25 | 188

Table 4. Comparison of outcome of different optimization algorithms based on sensitivity, specificity and precision

Method | Sensitivity | Specificity | Precision
Or (all features) | 76.92 | 74.6 | 70.36
MRA | 92 | 90.5 | 90.64
AOA | 64.5 | 56.5 | 59.72
JS | 73 | 51 | 59.84
ALO | 91 | 96 | 95.79
GWO | 90.5 | 98.5 | 98.37
WOA | 91.5 | 87 | 87.56
HBO | 90.5 | 97.5 | 97.31
EO | 91 | 91.5 | 91.46

The optimization methods were assessed based on accuracy, F-measure, geometric mean, sensitivity, specificity, and precision. The Grey Wolf Optimizer (GWO) and Heap-Based Optimizer (HBO) demonstrated superior performance in identifying optimal feature subsets, significantly boosting model performance. GWO achieved 94.5% accuracy along with exceptional F-measure (94.27) and geometric mean (94.42) scores. HBO reached 94% accuracy with outstanding F-measure and geometric mean values. Both methods also effectively minimized false positives, with HBO showing 90.5% sensitivity and 97.5% specificity, and GWO demonstrating 98.5% specificity.

This research highlights that while boosting algorithms like Gradient Boosting and AdaBoost offer high accuracy and precision for overall prediction, optimization methods like GWO and HBO provide unique approaches to feature selection that significantly enhance model performance across different scenarios. The integration of a unified classifier (SVM with an RBF kernel) post-feature selection ensures a fair comparison of performance gains attributable to feature optimization. Ultimately, the choice of an appropriate method depends on the specific task requirements and performance objectives.

This study systematically evaluated machine learning and optimization strategies across three experiments to predict student achievement. Experiment 1 assessed baseline machine learning algorithms (LR, DT, RF, Multi-Layer Perceptron), revealing 12th-grade percentage, CGPA, and gender as key performance predictors through cross-validation and feature importance analysis. Experiment 2 utilized boosting algorithms (GBC, XGBoost, LightGBM). GBC achieved particularly impressive classification metrics, while XGBoost and tuned LightGBM also demonstrated excellent accuracy and F1 scores, underscoring the importance of task-specific boosting method selection. Experiment 3 applied modern optimization methods for feature selection, notably GWO and HBO, which proved highly effective in identifying crucial features. These methods delivered superior accuracy and F-measure when combined with an SVM (RBF kernel) classifier after feature selection, significantly enhancing model predictions.

Overall, the experiments demonstrate the successful application of both machine learning algorithms and optimization techniques. While boosting algorithms deliver enhanced predictive accuracy, optimization-based feature selection provides additional gains by identifying the most influential features. Based on the experimental findings, the best strategy for predicting student performance is a combination of boosting algorithms for precise predictions and optimization-based feature selection. This integrated approach enables researchers to build stronger, more accurate predictive models of student outcomes.
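The idea behind the boosting stage can be illustrated with a minimal AdaBoost over decision stumps: each round reweights the examples the previous stump got wrong. This is only a toy sketch of the reweighting principle, not the Gradient Boosting/XGBoost/LightGBM implementations used in the experiments; labels are in {-1, +1} and all names are illustrative.

```python
import math

def stump_predict(x, feature, threshold, polarity):
    """A one-split weak learner: polarity if x[feature] > threshold, else -polarity."""
    return polarity if x[feature] > threshold else -polarity

def train_stump(X, y, w):
    """Exhaustively pick the stump with the lowest weighted error."""
    best, best_err = None, float("inf")
    for f in range(len(X[0])):
        for thr in sorted({row[f] for row in X}):
            for pol in (1, -1):
                err = sum(wi for xi, yi, wi in zip(X, y, w)
                          if stump_predict(xi, f, thr, pol) != yi)
                if err < best_err:
                    best_err, best = err, (f, thr, pol)
    return best, best_err

def adaboost(X, y, rounds=5):
    """Returns a list of (alpha, stump) pairs; y must be in {-1, +1}."""
    n = len(X)
    w = [1.0 / n] * n                       # start with uniform example weights
    ensemble = []
    for _ in range(rounds):
        stump, err = train_stump(X, y, w)
        err = min(max(err, 1e-10), 1 - 1e-10)
        if err >= 0.5:                      # weak learner no better than chance
            break
        alpha = 0.5 * math.log((1 - err) / err)
        preds = [stump_predict(xi, *stump) for xi in X]
        # upweight misclassified examples, downweight correct ones, renormalize
        w = [wi * math.exp(-alpha * yi * pi) for wi, yi, pi in zip(w, y, preds)]
        total = sum(w)
        w = [wi / total for wi in w]
        ensemble.append((alpha, stump))
    return ensemble

def ensemble_predict(ensemble, x):
    score = sum(a * stump_predict(x, *stump) for a, stump in ensemble)
    return 1 if score >= 0 else -1
```

In the combined strategy, the optimizer's best binary mask would first select the feature columns, and the boosted ensemble would then be trained on that reduced matrix.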

6. Implications for Administrative Practice and Institutional Policy

The XGBoost and GWO-enhanced predictive models are valuable strategic tools for higher education. They enable a shift from reactive to proactive, data-driven student support, improving retention and institutional effectiveness.

The models function as a data-driven triage system, accurately identifying "at-risk" students and allowing for the proactive allocation of limited resources such as advising and tutoring. This addresses the reality that struggling students are often unlikely to seek help on their own. By relying on objective data, the models also promote equity, flagging struggling students based on need rather than social or cultural factors.

Beyond individual student support, these models act as a diagnostic tool for the institution. Aggregated "at-risk" data can reveal systemic issues, such as "hot spots" in specific courses or programs. This empirical evidence allows administrators to make fact-based decisions about curricular reform, faculty development, and policy changes, fostering a dynamic feedback loop for continuous improvement.

Finally, the models serve as a catalyst for coordinated intervention. An "at-risk" flag can trigger a multi-departmental workflow, breaking down institutional silos. This creates a holistic, wraparound support network in which academic advisors, financial aid officers, and student services work together to address a student's needs simultaneously, building a more responsive and capable infrastructure for student success.

7. Conclusion

Hybrid machine learning solutions are crucial for accurate student performance prediction. Our research, spanning multiple experimental phases, found that XGBoost and similar boosting algorithms deliver superior accuracy, further improved by intelligent feature selection techniques such as GWO and HBO for effective dimensionality reduction. The consistent use of an SVM classifier facilitated fair performance comparisons. These findings offer practical insights for educational institutions seeking to identify and support vulnerable students proactively. Future advancements in predictive modelling will require integrating real-time student data and ensemble-based optimization to foster data-driven education.

Acknowledgements

This paper is financed by the European Union-NextGenerationEU, through the National Recovery and Resilience Plan of the Republic of Bulgaria, project BG-RRP-2.004-0001-C01.

REFERENCES

Ajibade, S. S. M., Ahmad, N. B., & Shamsuddin, S. (2019). An heuristic feature selection algorithm to evaluate academic performance of students. In 2019 IEEE 10th Control and System Graduate Research Colloquium, 110 – 114.

Apriyadi, M., & Rini, D. (2023). Hyperparameter optimization of support vector regression algorithm using metaheuristic algorithm for student performance prediction. International Journal of Advanced Computer Science and Applications, 14(2), 144 – 150.

Chen, J., Hsieh, H., & Do, Q. (2014). Predicting student academic performance: A comparison of two meta-heuristic algorithms inspired by cuckoo birds for training neural networks. Algorithms, 7(4), 538 – 553.

Cheng, B., Liu, Y., & Jia, Y. (2024). Evaluation of students' performance during the academic period using the XG-Boost Classifier-Enhanced AEO hybrid model. Expert Systems with Applications, 238, 122136.

Farissi, A., et al. (2022). High Accuracy Feature Selection Using Metaheuristic Algorithm for Classification of Student Academic Performance Prediction. Int. Conf. of Advanced Computing and Informatics, Cham: Springer International Publishing, 399 – 409.

Goran, R., et al. (2024). Identifying and understanding student dropouts using metaheuristic optimized classifiers and explainable artificial intelligence techniques. IEEE Access, 12, 122377 – 122400.

Hai, Q., & Wang, C. (2024). Optimizing Student Performance Prediction: A Data Mining Approach with MLPC Model and Metaheuristic Algorithm. International Journal of Advanced Computer Science & Applications, 15(4), 55 – 71.

Hasheminejad, S., & Sarvmili, M. (2019). S3PSO: Students’ performance prediction based on particle swarm optimization. Journal of AI and Data Mining, 7(1), 77 – 96.

Hussain, K., Talpur, N., & Aftab, M. (2020). A Novel Metaheuristic Approach to Optimization of Neuro-Fuzzy System for Students' Performance Prediction. Journal of Soft Computing and Data Mining, 1(1), 1 – 9.

Kukkar, A., Mohana, R., Sharma, A., & Nayyar, A. (2024). A novel methodology using RNN + LSTM + ML for predicting student’s academic performance. Education and Information Technologies, 1 – 37.

Kukkar, A., Sharma, A., Singh, P. K., & Kumar, Y. (2023). Predicting Students Final Academic Performance Using Deep Learning Techniques. In IoT, Big Data and AI for Improving Quality of Everyday Life: Present and Future Challenges: IOT, Data Science and Artificial Intelligence Technologies, Cham: Springer International Publishing, 219 – 241.

Li, Y., & He, M. (2024). Elevating Student Performance Prediction using Extra-Trees Classifier and Meta-Heuristic Optimization Algorithms. International Journal of Advanced Computer Science & Applications, 15(2), 434 – 450.

Ma, C. (2024). Improving the Prediction of Student Performance by Integrating a Random Forest Classifier with Meta-Heuristic Optimization Algorithms. International Journal of Advanced Computer Science & Applications, 15(6), 1032 – 1044.

Punitha, S., & Devaki, K. (2024). A high ranking-based ensemble network for student’s performance prediction using improved metaheuristic-aided feature selection and adaptive GAN for recommender system. Kybernetes. https://doi.org/10.1108/K-03-2024-0824

Ren, Z., & He, M. (2024). Meta-Model Classification Based on the Naïve Bias Technique Auto-Regulated via Novel Metaheuristic Methods to Define Optimal Attributes of Student Performance. IJACSA, 15(1), 1049 – 1060.

Shekhar, S., Kartikey, K., & Arya, A. (2020). Integrating decision trees with metaheuristic search optimization algorithm for a student’s performance prediction. 2020 IEEE SSCI, 655 – 661.

Song, X. (2024). Student performance prediction employing k-Nearest Neighbor Classification model and meta-heuristic algorithms. Multiscale and Multidisciplinary Modelling, Experiments and Design, 1 – 16.

Thaher, T., et al. (2021). An enhanced evolutionary student performance prediction model using whale optimization algorithm boosted with sine-cosine mechanism. Applied Sciences, 11(21), 10237.

Turabieh, H., et al. (2021). Enhanced Harris Hawks optimization as a feature selection for the prediction of student performance. Computing, 103(7), 1417 – 1438.

Xu, H., & Kim, M. (2024). Combination prediction method of students’ performance based on ant colony algorithm. Plos one, 19(3), e0300010.
