Why and How to Use “get_dummies” (One-Hot Encoding) in Machine Learning

You may know that machines work better with numbers. Python offers an interesting way to transform label data into numeric values: converting the information into indicator (binary) values of 0 or 1. By “label data”, also called “categorical data”, I mean values represented as strings, such as “male” and “female” for gender, or “blue”, “orange” and “red” for colours, or whatever it may be.
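As a minimal sketch of the technique, here is pandas’ get_dummies applied to a small made-up DataFrame (the column names and values below are purely illustrative):

```python
import pandas as pd

# Hypothetical example data with two categorical (label) columns.
df = pd.DataFrame({
    "gender": ["male", "female", "female", "male"],
    "colour": ["blue", "orange", "red", "blue"],
})

# get_dummies replaces each category with its own 0/1 indicator column,
# e.g. colour_blue, colour_orange, colour_red.
encoded = pd.get_dummies(df, columns=["gender", "colour"], dtype=int)
print(encoded)
```

Each row then carries a 1 in the column matching its original value and 0 everywhere else, which is exactly the numeric representation most ML algorithms expect.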

3 Java Champions To Be Inspired By and What They Have in Common

I can only imagine that becoming a Java Champion is the pinnacle of a Java Developer’s career. Among many other things, it confirms that you are passionate about Java, a leader, and technically very capable. Needless to say, not every Java Developer aspires to be a Java Champion, and they don’t need to, of course. But all Java Developers, beginners or seniors, should appreciate the journey and hard work it takes to get there.

TF-IDF: Less Is More

You may have encountered this acronym if you’re studying Machine Learning (ML), and specifically Natural Language Processing (NLP): “TF-IDF”. I keep finding it in every research paper I read about automatic text summarization.
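To make the idea concrete, here is a minimal sketch using scikit-learn’s TfidfVectorizer on a made-up three-sentence corpus (the corpus itself is an assumption for demonstration only):

```python
from sklearn.feature_extraction.text import TfidfVectorizer

# Toy corpus: words that appear in every document (like "the")
# receive a low IDF weight; rarer words score higher.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "cats and dogs are pets",
]

vectorizer = TfidfVectorizer()
scores = vectorizer.fit_transform(corpus)
print(vectorizer.get_feature_names_out())
print(scores.toarray().round(2))
```

That down-weighting of ubiquitous words is the “less is more” intuition: a term that appears everywhere carries little information about any single document.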

Java 17: What’s New, Removed and Preview in JDK 17

A new API to replace the Java Native Interface (JNI), the Foreign Function & Memory API (incubator), and enhancements to switch expressions and statements (pattern matching for switch, in preview) are among the features. JDK 17 is a long-term support (LTS) release, which means it has Oracle support for many years to come. The other LTS JDKs are Java 8 and Java 11. LTS JDKs are released every three years, and non-LTS releases every six months.

Are Apache Struts and JavaServer Faces (JSF) Still Around?

JavaServer Faces (JSF) and Struts. Do you remember them? The other day, I talked to a friend about how the web frameworks we used in the mid-to-late 2000s, such as Struts and JSF, are so different from the ones most people are using today — for example, React and Angular. They’re nothing alike, right?

Bayes Theorem: The Basis for Self-Driving Cars and Other Machine Learning Applications

Bayes’ theorem, formulated by Thomas Bayes in the 18th century, describes a simple and powerful method for calculating the probability of a belief/hypothesis given a new piece of evidence/observation. Throughout history it has been applied to locate lost nuclear bombs, and it is the basis for machine learning algorithms (classifiers) used in spam filtering, self-driving cars, assessing financial risk and more. These algorithms can accurately estimate the probability of an event occurring and therefore make good decisions.
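For reference, the theorem in its standard form, where H is the hypothesis and E is the new evidence:

$$P(H \mid E) = \frac{P(E \mid H)\, P(H)}{P(E)}$$

In words: the updated (posterior) probability of the hypothesis equals the likelihood of seeing the evidence under that hypothesis, times the prior probability of the hypothesis, divided by the overall probability of the evidence.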
