Description: Weighted KNN (weighted k-nearest neighbors) is a variant of the k-nearest neighbors algorithm that assigns each neighbor a weight based on its distance from the query point. Whereas in traditional KNN every neighbor contributes equally to the prediction, in Weighted KNN closer neighbors have a greater influence on the final decision. This makes the model more sensitive to the most relevant data points and can improve prediction accuracy. Weights can be computed in several ways; the most common scheme is inverse distance, in which nearer neighbors receive higher weights than more distant ones. The technique is particularly useful when the data are imbalanced or when the nearest examples should dominate the decision. Weighted KNN applies to both classification and regression tasks, making it a versatile tool in machine learning. Its implementation is relatively straightforward and, being non-parametric, it requires no assumptions about the data distribution, which makes it accessible for a wide range of applications in data science and artificial intelligence.
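
The inverse-distance weighting described above can be sketched as follows. This is a minimal illustrative implementation, not a reference one: the function name `weighted_knn_predict` and the small `eps` term (added to avoid division by zero when a neighbor coincides with the query point) are choices made for this example, and it assumes NumPy for the distance computation.

```python
import numpy as np

def weighted_knn_predict(X_train, y_train, x_query, k=3, eps=1e-9):
    # Euclidean distance from the query point to every training point
    dists = np.linalg.norm(X_train - x_query, axis=1)
    # Indices of the k nearest neighbors
    nearest = np.argsort(dists)[:k]
    # Inverse-distance weights: closer neighbors get larger weights;
    # eps guards against division by zero for an exact match
    weights = 1.0 / (dists[nearest] + eps)
    # Weighted vote: accumulate weights per class, return the heaviest class
    votes = {}
    for label, w in zip(y_train[nearest], weights):
        votes[label] = votes.get(label, 0.0) + w
    return max(votes, key=votes.get)

# Toy 2-D dataset with two well-separated classes
X = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],
              [1.0, 1.0], [1.1, 0.9], [0.9, 1.1]])
y = np.array([0, 0, 0, 1, 1, 1])

print(weighted_knn_predict(X, y, np.array([0.15, 0.15]), k=3))  # → 0
```

For regression, the same weights would instead be used to compute a weighted average of the neighbors' target values. In practice, scikit-learn offers this behavior directly via `KNeighborsClassifier(weights='distance')`.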