
Build a crop segmentation machine learning model with Planet data and Amazon SageMaker geospatial capabilities

AWS Machine Learning Blog

In late 2023, Planet announced a partnership with AWS to make its geospatial data available through Amazon SageMaker. In this analysis, we use a K-nearest neighbors (KNN) model to perform crop segmentation, and we compare the results against ground truth imagery of an agricultural region.
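As a rough illustration of the pixel-wise approach described here, the sketch below fits a KNN classifier to per-pixel spectral features and crop/non-crop labels. The feature and label arrays are placeholders, not the Planet imagery or SageMaker geospatial workflow from the post.

```python
# Hypothetical sketch: pixel-wise crop segmentation with a KNN classifier.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import classification_report

# Placeholder data: each row is one pixel's spectral bands (e.g., R, G, B, NIR);
# each label marks crop (1) vs. non-crop (0) ground truth.
rng = np.random.default_rng(0)
X = rng.random((10_000, 4))
y = rng.integers(0, 2, size=10_000)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y
)

knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train, y_train)

# Predicted per-pixel labels can be reshaped back to the image grid
# to form a segmentation mask for comparison with ground truth.
print(classification_report(y_test, knn.predict(X_test)))
```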


Identifying defense coverage schemes in NFL’s Next Gen Stats

AWS Machine Learning Blog

Quantitative evaluation: We use 2018–2020 season data for model training and validation, and 2021 season data for model evaluation. We perform five-fold cross-validation to select the best model during training, and use hyperparameter optimization to choose the best settings across multiple model architectures and training parameters.
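The sketch below illustrates that selection workflow in general terms: a five-fold cross-validated grid search over candidate hyperparameters on the training seasons, followed by a check on a held-out season. It uses a generic scikit-learn classifier and synthetic placeholder data, not the actual Next Gen Stats model or features.

```python
# Illustrative sketch: five-fold cross-validation for hyperparameter selection,
# then evaluation on a held-out season. Placeholder data and model only.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV

# Placeholder arrays standing in for 2018-2020 training data and 2021 evaluation data.
rng = np.random.default_rng(1)
X_train, y_train = rng.random((2000, 10)), rng.integers(0, 8, size=2000)
X_eval, y_eval = rng.random((500, 10)), rng.integers(0, 8, size=500)

param_grid = {
    "n_estimators": [100, 300],
    "max_depth": [3, 5],
    "learning_rate": [0.05, 0.1],
}

search = GridSearchCV(
    GradientBoostingClassifier(random_state=0),
    param_grid,
    cv=5,                  # five-fold cross-validation on the training data
    scoring="accuracy",
    n_jobs=-1,
)
search.fit(X_train, y_train)
print("Best hyperparameters:", search.best_params_)

# Final evaluation on the held-out season.
print("Held-out accuracy:", search.score(X_eval, y_eval))
```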


Understanding and Building Machine Learning Models

Pickl AI

Some algorithms work better with small datasets (e.g., K-Nearest Neighbors), while others can handle large datasets efficiently. Cross-Validation: Instead of using a single train-test split, cross-validation involves dividing the data into multiple folds and training the model on each fold. A minimal sketch of that fold-by-fold loop follows below.
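The following is a minimal sketch of that idea, assuming a scikit-learn setup with placeholder data: split the data into five folds, train on four, validate on the fifth, and average the scores.

```python
# Minimal k-fold cross-validation loop on placeholder data.
import numpy as np
from sklearn.model_selection import KFold
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(2)
X = rng.random((500, 6))            # placeholder features
y = rng.integers(0, 2, size=500)    # placeholder labels

scores = []
for train_idx, val_idx in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    model = KNeighborsClassifier(n_neighbors=5)
    model.fit(X[train_idx], y[train_idx])          # train on four folds
    scores.append(model.score(X[val_idx], y[val_idx]))  # validate on the held-out fold

print(f"Mean accuracy across folds: {np.mean(scores):.3f}")
```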