Wiki
We have compiled hundreds of related entries to help you understand "artificial intelligence"
The harmonic mean is a method of averaging that comes in simple and weighted forms. The weighted harmonic mean is a variant of the weighted arithmetic mean. In many cases we only know the total value m of a variable in each group but lack the number of units in each group, so the weighted arithmetic mean cannot be applied directly. Instead, […]
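A minimal Python sketch of the weighted harmonic mean for the case described above, where only the per-group totals m_i and per-group averages x_i are known (the function name and example figures are illustrative, not from the source):

```python
def weighted_harmonic_mean(values, totals):
    """Weighted harmonic mean: sum(m_i) / sum(m_i / x_i).

    `values` are the per-group averages x_i and `totals` are the
    per-group sums m_i used as weights, matching the case where
    the number of units per group is unknown.
    """
    return sum(totals) / sum(m / x for x, m in zip(values, totals))

# Example: two groups with average prices 2 and 4 and total spend 10 and 20.
# Implied unit counts are 10/2 = 5 and 20/4 = 5, so the overall average
# price is (10 + 20) / (5 + 5) = 3.0.
print(weighted_harmonic_mean([2, 4], [10, 20]))  # 3.0
```

Note that m_i / x_i recovers the hidden unit count of each group, which is why this form substitutes for the weighted arithmetic mean.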
Trial and error is a method of solving problems by repeated attempts.
Slack variables are auxiliary variables added when applying the soft margin method for classification. They are introduced to mitigate the impact of outliers on the classifier.
Stochastic gradient descent (SGD) is an iterative variant of the gradient descent algorithm in which each update is computed from a single randomly chosen sample (or a small batch) rather than the full dataset.
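A small illustrative sketch of SGD on a one-variable linear regression (the function, learning rate, and data are assumptions for illustration only):

```python
import random

def sgd_linear_regression(xs, ys, lr=0.01, epochs=200, seed=0):
    """Fit y ≈ w*x + b by stochastic gradient descent: at each step,
    update the parameters using the gradient of the squared error on
    a single randomly chosen sample."""
    rng = random.Random(seed)
    w, b = 0.0, 0.0
    data = list(zip(xs, ys))
    for _ in range(epochs):
        x, y = rng.choice(data)        # one random sample per update
        err = (w * x + b) - y          # prediction error on that sample
        w -= lr * 2 * err * x          # gradient of err**2 w.r.t. w
        b -= lr * 2 * err              # gradient of err**2 w.r.t. b
    return w, b

# Data drawn from y = 2x + 1; SGD should recover roughly w ≈ 2, b ≈ 1.
xs = [0, 1, 2, 3, 4]
ys = [1, 3, 5, 7, 9]
w, b = sgd_linear_regression(xs, ys, lr=0.02, epochs=2000)
```

Because each step uses only one sample, SGD trades noisier updates for a much lower per-iteration cost than full-batch gradient descent.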
A surrogate function is a function used in place of the target function when the target function cannot be used directly or performs poorly, for example when it is non-differentiable or expensive to evaluate.
The loss function is a metric used to measure the quality of a prediction model: it reflects the gap between the model's predicted values and the true values. It is the core of the empirical risk function and also a component of the structural risk function. Common loss functions include the log loss, squared loss, exponential loss, and hinge loss.
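The four common loss functions named above can be written down directly for a binary label y in {-1, +1} and a raw model score f(x); this is a standard formulation, not code from the source:

```python
import math

# y is the true label in {-1, +1}; f is the model's raw score f(x).
def square_loss(y, f):       # (y - f)^2
    return (y - f) ** 2

def hinge_loss(y, f):        # max(0, 1 - y*f), used by SVMs
    return max(0.0, 1.0 - y * f)

def exponential_loss(y, f):  # exp(-y*f), used by AdaBoost
    return math.exp(-y * f)

def log_loss(y, f):          # log(1 + exp(-y*f)), logistic regression
    return math.log(1.0 + math.exp(-y * f))

# A correctly classified point well outside the margin incurs no hinge loss:
print(hinge_loss(+1, 2.0))   # 0.0
print(hinge_loss(+1, 0.5))   # 0.5 (inside the margin, still penalized)
```

All four penalize predictions whose sign disagrees with the label, but they differ in how strongly they punish confident mistakes.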
Feature selection is the process of selecting a subset of features, usually for building models. Its advantages are as follows: it simplifies the model; it shortens training time; and it improves generalization, which reduces overfitting. A feature selection algorithm can be regarded as the combination of a search technique and an evaluation metric: the former proposes candidate feature subsets, and the latter is used to compare them.
The objective function expresses the goal being pursued in terms of the design variables; it is a function of those design variables.
Reinforcement learning (RL) is an important branch of machine learning and a product of the intersection of multiple disciplines and fields. Its essence is sequential decision-making: making decisions automatically and continuously.
The scoring function specifies what kind of "score" the selected model outputs: for example, the predicted target value, the probability of a prediction, or the probability of a particular target value.
Singular value decomposition (SVD) is an important matrix factorization method. The eigendecomposition of a symmetric matrix is grounded in spectral analysis, and singular value decomposition is the generalization of spectral theory to arbitrary matrices.
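A short sketch of the factorization, assuming NumPy is available (the example matrix is illustrative):

```python
import numpy as np

# Any real matrix A factors as A = U @ diag(s) @ Vt, with U and Vt
# orthogonal and s the non-negative singular values in descending order.
A = np.array([[3.0, 0.0],
              [4.0, 5.0]])
U, s, Vt = np.linalg.svd(A)

# The factorization reconstructs A exactly (up to floating-point error).
A_rec = U @ np.diag(s) @ Vt
print(np.allclose(A, A_rec))  # True
```

For a symmetric matrix the singular values coincide with the absolute values of its eigenvalues, which is the sense in which SVD generalizes spectral analysis.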
Soft voting, also called weighted average probability voting, is a voting method that classifies using the models' output class probabilities. Given a set of model weights, the weighted average of each class's predicted probability is computed, and the class with the largest average is selected.
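A minimal sketch of the procedure just described (function name, weights, and probabilities are made up for illustration):

```python
def soft_vote(prob_lists, weights):
    """Weighted-average-probability voting: average each class's
    predicted probability across models using the given weights,
    then pick the class with the largest average."""
    total_w = sum(weights)
    n_classes = len(prob_lists[0])
    avg = [sum(w * probs[c] for probs, w in zip(prob_lists, weights)) / total_w
           for c in range(n_classes)]
    return max(range(n_classes), key=lambda c: avg[c]), avg

# Three classifiers, two classes; the second model gets double weight.
probs = [[0.7, 0.3], [0.4, 0.6], [0.6, 0.4]]
winner, avg = soft_vote(probs, weights=[1, 2, 1])
# Class 0 averages (0.7 + 2*0.4 + 0.6)/4 = 0.525, so class 0 wins.
```

In contrast, hard voting counts only each model's predicted label and ignores how confident the prediction was.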
Spectral clustering (SC) is a clustering method based on graph theory. It partitions a weighted undirected graph into two or more optimal subgraphs so that vertices within each subgraph are as similar as possible and different subgraphs are as dissimilar (as far apart) as possible, thereby achieving the usual goal of clustering.
The hard margin is the basis for selecting the separating hyperplane in a support vector machine. It refers to the case where classification is completely accurate and there is no loss, that is, the loss value is 0; one only needs to find the plane lying exactly midway between the two classes. Its opposite is the soft margin, which allows a certain number of samples to be misclassified; the optimization objective then consists of two parts, […]
Smoothing is a commonly used data processing method for reducing noise and short-term fluctuations in data.
The segmentation variable (splitting variable) is the reference variable chosen when partitioning the input space: the variable used for splitting in classification problems so as to achieve an optimal classification.
Support vector machine (SVM) is a supervised learning method for processing data in classification and regression analysis.
Soft margin maximization is an optimization method that finds the optimal separating hyperplane by maximizing the margin while permitting some misclassification through slack variables.
Transfer learning is a method of using existing knowledge to learn new knowledge.
Artificial intelligence, also known as machine intelligence, refers to intelligence displayed by machines created by humans. Usually, artificial intelligence refers to technology that exhibits human-like intelligence through ordinary computer programs. Research topics: current research in artificial intelligence is divided into several sub-fields, and researchers hope that artificial intelligence systems will possess certain specific capabilities, […]
Oversampling refers to increasing the number of samples of a certain class in the training set to reduce class imbalance.
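One simple form of oversampling, random oversampling, can be sketched as follows (the function and dataset are illustrative assumptions; duplicates are drawn until every class matches the largest class):

```python
import random

def random_oversample(samples, labels, seed=0):
    """Random oversampling: duplicate randomly chosen samples of each
    minority class until every class has as many samples as the
    largest one, reducing class imbalance in the training set."""
    rng = random.Random(seed)
    by_class = {}
    for x, y in zip(samples, labels):
        by_class.setdefault(y, []).append(x)
    target = max(len(v) for v in by_class.values())
    out_x, out_y = [], []
    for y, xs in by_class.items():
        extra = [rng.choice(xs) for _ in range(target - len(xs))]
        for x in xs + extra:
            out_x.append(x)
            out_y.append(y)
    return out_x, out_y

# 4 negatives vs 1 positive: the positive is duplicated to 4 copies.
X = [[0], [1], [2], [3], [9]]
y = [0, 0, 0, 0, 1]
Xb, yb = random_oversample(X, y)
```

More elaborate variants (such as synthesizing new minority samples rather than duplicating existing ones) follow the same goal of balancing the class counts.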
The average gradient is the average rate of grayscale change in an image. It is used to indicate image clarity, since grayscale differs markedly near image boundaries and on either side of shadow lines. It reflects the rate of change in the contrast of fine image detail, that is, the rate of change of image density across directions, and represents the relative clarity of the image. The average gradient is the image […]
Latent semantic analysis is mainly concerned with the relationships behind words rather than their dictionary definitions; these relationships are grounded in the actual contexts in which the words are used. The idea originated with psycholinguists, who believed that a common mechanism underlies the hundreds of languages in the world, and concluded that anyone in a specific language […]
The global minimum is the smallest function value over all points; the related concept is the local minimum. If the error function has only one local minimum, that local minimum is also the global minimum. If the error function has multiple local minima, there is no guarantee that a found solution is the global minimum. A common way to approach the global minimum is to find multiple local minima and take the smallest among them. […]
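The multi-start heuristic just described can be sketched in a few lines; the local search routine, the test function, and the starting points below are illustrative assumptions, not from the source:

```python
def local_minimize(f, x0, step=0.1, iters=200):
    """Crude 1-D local search: move downhill in fixed steps, halving
    the step size whenever neither direction improves."""
    x = x0
    for _ in range(iters):
        for cand in (x - step, x + step):
            if f(cand) < f(x):
                x = cand
                break
        else:
            step *= 0.5
    return x

def multistart_minimum(f, starts):
    """Run local search from several starting points and keep the
    smallest local minimum found, as the heuristic above suggests."""
    return min((local_minimize(f, x0) for x0 in starts), key=f)

# f has two local minima, near x ≈ 1.35 and x ≈ -1.47; only the
# latter is the global minimum.
f = lambda x: x**4 - 4 * x**2 + x
x_best = multistart_minimum(f, starts=[-2.0, 0.0, 2.0])
```

Starting only from x = 2.0 would get stuck at the shallower minimum near 1.35, which is exactly the failure mode the multi-start strategy guards against.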