14 Loss Functions You Can Use for Regression
In mathematical optimization and decision theory, a loss function or cost function (sometimes also called an error function) is a function that maps an event, or the values of one or more variables, onto a real number intuitively representing some "cost" associated with that event. An optimization problem seeks to minimize a loss function. An objective function is either a loss function or its opposite (in specific domains variously called a reward function, a profit function, a utility function, a fitness function, etc.), in which case it is to be maximized. In some settings, the loss function can also include terms from several levels of a model hierarchy. The kind of loss function you use depends on the kind of problem you are working on, i.e. regression or classification.
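As a minimal illustration of how a regression loss maps predictions and targets to a single real-valued "cost", here is a sketch of two of the most common choices, mean squared error (MSE) and mean absolute error (MAE), using NumPy. The function names and sample data are illustrative, not from any particular library.

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean squared error: the average of the squared residuals."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return np.mean((y_true - y_pred) ** 2)

def mae(y_true, y_pred):
    """Mean absolute error: the average of the absolute residuals."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return np.mean(np.abs(y_true - y_pred))

# Toy targets and predictions
y_true = [3.0, -0.5, 2.0, 7.0]
y_pred = [2.5,  0.0, 2.0, 8.0]

print(mse(y_true, y_pred))  # 0.375
print(mae(y_true, y_pred))  # 0.5
```

Both functions collapse the per-sample errors into one number that an optimizer can minimize; MSE penalizes large residuals more heavily, while MAE is less sensitive to outliers.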
Jan-21-2023, 12:05:32 GMT