  1. entropy — SciPy v1.17.0 Manual

    entropy has experimental support for Python Array API Standard compatible backends in addition to NumPy. Please consider testing these features by setting an environment variable …
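    As a minimal usage sketch of `scipy.stats.entropy` (the probability vectors below are made-up examples): it returns the Shannon entropy of `pk`, or the Kullback-Leibler divergence when a second distribution `qk` is passed.

    ```python
    from scipy.stats import entropy

    # Shannon entropy of a fair coin, in bits (base=2)
    h = entropy([0.5, 0.5], base=2)
    print(h)  # 1.0

    # With a second distribution qk, entropy() instead returns the
    # Kullback-Leibler divergence D(pk || qk)
    kl = entropy([0.5, 0.5], qk=[0.9, 0.1], base=2)
    print(kl)
    ```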

  2. How to Compute Entropy using SciPy? - GeeksforGeeks

    Jul 23, 2025 · Entropy is a fundamental concept for measuring the uncertainty or randomness in a dataset. Entropy plays a significant role in machine learning models such as decision trees, …

  3. Cross-entropy - Wikipedia

    In information theory, the cross-entropy between two probability distributions p and q, over the same underlying set of events, measures the average number of bits needed to identify an event drawn …
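    The definition in the snippet can be written as H(p, q) = -Σ_x p(x) log2 q(x). A short NumPy sketch with made-up distributions, showing that cross-entropy is minimized when q matches p:

    ```python
    import numpy as np

    # Hypothetical distributions over the same three events
    p = np.array([0.5, 0.25, 0.25])  # true distribution
    q = np.array([0.25, 0.5, 0.25])  # coding/model distribution

    # Cross-entropy in bits: H(p, q) = -sum_x p(x) * log2 q(x)
    H_pq = -np.sum(p * np.log2(q))
    # When q == p this reduces to the Shannon entropy H(p)
    H_p = -np.sum(p * np.log2(p))
    print(H_pq, H_p)  # 1.75 1.5
    ```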

  4. log_loss — scikit-learn 1.8.0 documentation

    Log loss, aka logistic loss or cross-entropy loss. This is the loss function used in (multinomial) logistic regression and extensions of it such as neural networks, defined as the negative log-likelihood of a …
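    A minimal usage sketch of `sklearn.metrics.log_loss` (the labels and predicted probabilities below are made-up):

    ```python
    from sklearn.metrics import log_loss

    # Binary example: true labels and predicted probabilities of class 1
    y_true = [0, 1, 1, 0]
    y_pred = [0.1, 0.9, 0.8, 0.3]

    # log_loss returns the mean negative log-likelihood (natural log)
    loss = log_loss(y_true, y_pred)
    print(loss)
    ```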

  5. Cross-Entropy Loss Function in Machine Learning: Enhancing Model ...

    Feb 27, 2026 · Cross-entropy is a popular loss function used in machine learning to measure the performance of a classification model. Namely, it measures the difference between the discovered …

  6. CrossEntropyLoss — PyTorch 2.11 documentation

    class torch.nn.CrossEntropyLoss(weight=None, size_average=None, ignore_index=-100, reduce=None, reduction='mean', label_smoothing=0.0). This criterion …
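    A short usage sketch of the criterion described above (the logits and targets are made-up). Note that it expects raw logits, since it applies log-softmax internally:

    ```python
    import torch
    import torch.nn as nn

    # Hypothetical batch of 2 samples over 3 classes: raw logits
    logits = torch.tensor([[2.0, 0.5, 0.1],
                           [0.2, 1.5, 0.3]])
    targets = torch.tensor([0, 1])  # class indices, not one-hot

    # CrossEntropyLoss combines log_softmax and NLL loss in one criterion
    criterion = nn.CrossEntropyLoss()  # reduction='mean' by default
    loss = criterion(logits, targets)
    print(loss.item())
    ```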

  7. Differential entropy - Wikipedia

    Differential entropy (also referred to as continuous entropy) is a concept in information theory that began as an attempt by Claude Shannon to extend the idea of (Shannon) entropy (a measure of average …
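    For a concrete case of the concept above: a Gaussian's differential entropy has the closed form h(X) = 0.5 * ln(2 * pi * e * sigma^2) nats, which SciPy's continuous distributions expose via `.entropy()`. A sketch, with an arbitrary sigma:

    ```python
    import numpy as np
    from scipy.stats import norm

    # Closed form for a Gaussian: h(X) = 0.5 * ln(2 * pi * e * sigma^2)
    sigma = 2.0
    closed_form = 0.5 * np.log(2 * np.pi * np.e * sigma**2)

    # scipy's frozen distributions report the same quantity (in nats)
    print(norm(scale=sigma).entropy(), closed_form)
    ```

    Unlike Shannon entropy, this quantity can be negative: for sufficiently small sigma, the closed form drops below zero.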

  8. How to Implement Softmax and Cross-Entropy in Python and PyTorch

    Apr 24, 2023 · Implementing Cross-Entropy Loss using Python and NumPy. Below we discuss the implementation of cross-entropy loss using Python and the NumPy library. Import the NumPy …
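    The article's implementation is cut off in the snippet; a common NumPy sketch along those lines (logits and labels are made-up):

    ```python
    import numpy as np

    def softmax(z):
        # Subtract the row max for numerical stability before exponentiating
        e = np.exp(z - np.max(z, axis=-1, keepdims=True))
        return e / np.sum(e, axis=-1, keepdims=True)

    def cross_entropy(probs, labels):
        # Mean negative log-probability of the true class (labels are indices)
        n = probs.shape[0]
        return -np.mean(np.log(probs[np.arange(n), labels]))

    logits = np.array([[2.0, 0.5, 0.1],
                       [0.2, 1.5, 0.3]])
    labels = np.array([0, 1])
    loss = cross_entropy(softmax(logits), labels)
    print(loss)
    ```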

  9. Categorical Cross-Entropy in Multi-Class Classification

    Nov 25, 2025 · Categorical Cross-Entropy is widely used as a loss function to measure how well a model predicts the correct class in multi-class classification problems. It measures the difference …
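    A minimal NumPy sketch of the one-hot formulation, -sum_k y_k * log(p_k) averaged over samples (the targets and predictions below are made-up):

    ```python
    import numpy as np

    # One-hot targets: each row marks the correct class for one sample
    y_true = np.array([[1, 0, 0],
                       [0, 1, 0]])
    # Hypothetical predicted class probabilities (each row sums to 1)
    y_pred = np.array([[0.7, 0.2, 0.1],
                       [0.1, 0.8, 0.1]])

    # Categorical cross-entropy: only the true class's log-probability survives
    cce = -np.mean(np.sum(y_true * np.log(y_pred), axis=1))
    print(cce)
    ```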

  10. torch.nn.functional.cross_entropy — PyTorch 2.11 documentation

    torch.nn.functional.cross_entropy(input, target, weight=None, size_average=None, ignore_index=-100, reduce=None, reduction='mean', label_smoothing=0.0) …