Introduction

Entropy is one of the key aspects of Machine Learning. It is a must-know for anyone who wants to make a mark in Machine Learning, and yet it perplexes many of us. The focus of this article is to understand how entropy works by exploring the underlying concepts of probability theory, how the formula works, its significance, and why it is important for the Decision Tree algorithm.

The term entropy was first coined by the German physicist and mathematician Rudolf Clausius and was used in the field of thermodynamics. Claude Shannon, a mathematician and electrical engineer, published the paper "A Mathematical Theory of Communication", in which he addressed the problems of measuring information, choice, and uncertainty. Shannon is also known as the 'father of information theory', as he founded the field.

"Information theory is a mathematical approach to the study of the coding of information, along with the quantification, storage, and communication of information."

In his paper, Shannon set out to mathematically measure the statistical nature of "lost information" in phone-line signals. The work was aimed at the problem of how best to encode the information a sender wants to transmit. For this purpose, information entropy was developed as a way to estimate the information content in a message, that is, a measure of the uncertainty reduced by the message.

So, we know that the primary measure in information theory is entropy. Let's look at this concept in depth. But first things first: what is this information? What 'information' am I referring to?

The English meaning of the word entropy is a state of disorder, confusion, and disorganization. In simple words, we know that information is a set of facts learned about something or someone. Notionally, we can understand that information is something that can be stored in, transferred, or passed on as variables, which can in turn take different values. In other words, a variable is a unit of storage.

So, we get information from a variable by seeing its value, in the same way that we get details (or information) from a message or letter by reading its content. Entropy measures the "amount of information" present in a variable. This amount is estimated not only from the number of different values the variable can take, but also from the amount of surprise each value holds.
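To make the "amount of surprise" idea concrete, here is a minimal sketch of how the entropy of a variable can be estimated from its observed values. It assumes the standard Shannon formula H = -Σ p_i · log2(p_i), which the article builds toward; the function name and the toy coin data are illustrative only and not taken from the article.

```python
# A minimal sketch of Shannon entropy for a discrete variable.
# The function name and example data are illustrative; the formula
# H = -sum(p * log2(p)) is the standard Shannon entropy in bits.
from collections import Counter
from math import log2

def compute_entropy(values):
    """Estimate the entropy (in bits) of a variable from its observed values."""
    counts = Counter(values)
    total = len(values)
    # Convert counts to probabilities and accumulate -p * log2(p)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A fair coin (maximum surprise for two outcomes) carries 1 bit of entropy,
# while a heavily biased coin carries far less surprise.
print(compute_entropy(["H", "T", "H", "T"]))            # 1.0
print(compute_entropy(["H", "H", "H", "H", "H", "T"]))  # ~0.65
```

Notice that the variable whose values are spread evenly (the fair coin) yields the highest entropy, while the variable dominated by a single value yields much less: each new observation of the biased coin is hardly surprising, so it carries little information.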