# Machine Learning in Hyderabad


**Machine learning is an application of artificial intelligence (AI)** that gives systems the ability to automatically learn and improve from experience without being explicitly programmed. Machine learning focuses on the development of computer programs that can access data and use it to learn for themselves.

*Machine learning algorithms are often categorized as supervised or unsupervised.*

The learning process begins with observations or data, such as examples, direct experience, or instruction, in order to look for patterns in the data and make better decisions in the future based on the examples we provide. The primary aim is to allow computers to learn automatically, without human intervention or assistance, and to adjust their actions accordingly.

__Steps involved in fitting a machine learning model to a data set.__

- Suppose you are provided with a data set that has **the area of a house** in square feet and the respective **price**
- How do you think you will come up with a machine learning model to learn the data and use this model to **predict the price** of a random house given the area?

You will learn that in the following cards.

Let’s get started…

### House Price Prediction

- We have a data set consisting of houses with their *area in sq feet* and their respective *prices*
- Assume that the *prices* of the houses are *dependent* on the *area*
- Let us learn how to represent this idea in *machine learning parlance*


### ML Notations

- The input / independent variables are denoted by ‘x’
- The output / dependent variable(s) are denoted by ‘y’

In our problem, the *area* in square feet is 'x' and the house **prices** are 'y'.

Here the change in one variable depends on the change in another variable. This technique is called *regression*.

- The objective is that, given a set of training data, the algorithm comes up with a way to map ‘x’ to ‘y’
- This is denoted by h: X → Y

h(x) is called the **hypothesis** that does the **mapping**.
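As a minimal sketch, a single-feature linear hypothesis h(x) = θ0 + θ1·x can be written as a plain function. The parameter values and house area below are made up for illustration, not taken from real data:

```python
# Minimal linear hypothesis h(x) = theta0 + theta1 * x.
# The parameter values and the house area are illustrative assumptions.

def hypothesis(theta0, theta1, x):
    """Map an input feature x (area in sq ft) to a prediction y (price)."""
    return theta0 + theta1 * x

# Example: a base price of 50,000 plus 100 per square foot.
price = hypothesis(50_000, 100, 1_200)  # 50_000 + 100 * 1_200 = 170_000
```

Training consists of finding values of θ0 and θ1 that make this mapping fit the data well.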

#### Why Cost Function?

- You learned how to map the input and output variables through the **hypothesis function** in the previous example.
- After defining the hypothesis function, its accuracy has to be determined to gauge its **predictive power**, i.e., how accurately do the square-feet values predict the housing prices?
- The cost function is the representation of this objective.

### Demystifying Cost Function

For linear regression, the cost function is the (halved) mean of the squared errors:

J(θ0, θ1) = (1 / 2m) × Σ (ŷ(i) − y(i))², summed over i = 1 … m

where

- m – the number of observations
- ŷ – the predicted value
- y – the actual value
- i – the index of a single observation

The **objective** is to **minimize** the **sum of squared errors** between the **predicted** and actual values.
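A minimal sketch of this cost function follows. The data set below is made up and satisfies y = 1 + 2x exactly, so the cost of the true parameters is zero:

```python
def cost(theta0, theta1, xs, ys):
    """Halved mean squared error: J = (1 / 2m) * sum((h(x) - y) ** 2)."""
    m = len(xs)
    total = 0.0
    for x, y in zip(xs, ys):
        y_hat = theta0 + theta1 * x   # predicted value for this observation
        total += (y_hat - y) ** 2     # squared error for this observation
    return total / (2 * m)

xs = [1.0, 2.0, 3.0]
ys = [3.0, 5.0, 7.0]                  # generated from y = 1 + 2x
print(cost(1.0, 2.0, xs, ys))         # 0.0 for a perfect fit
print(cost(0.0, 0.0, xs, ys))         # larger for a poor fit
```

Minimizing J over θ0 and θ1 is exactly the objective stated above.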

##### Cost Function Intuition

__Gradient Descent Explained__

- Imagine the error surface as a 3D plot, with the parameters θ0 and θ1 on the x and y axes and the error value on the z axis
- The intuition behind gradient descent is to choose the parameters that make the cost as low as possible
- Each step down the cost surface is taken in the direction of steepest descent
- The learning rate (alpha) decides the magnitude of each step
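The steps above can be sketched as batch gradient descent for the one-feature hypothesis. The data set below is illustrative, generated from y = 1 + 2x, and the choices of alpha and the step count are assumptions:

```python
def gradient_descent(xs, ys, alpha=0.1, steps=1000):
    """Fit h(x) = theta0 + theta1 * x by batch gradient descent."""
    m = len(xs)
    theta0 = theta1 = 0.0
    for _ in range(steps):
        # Partial derivatives of the cost J with respect to theta0 and theta1.
        grad0 = sum((theta0 + theta1 * x - y) for x, y in zip(xs, ys)) / m
        grad1 = sum((theta0 + theta1 * x - y) * x for x, y in zip(xs, ys)) / m
        # Step in the direction of steepest descent, scaled by alpha.
        theta0 -= alpha * grad0
        theta1 -= alpha * grad1
    return theta0, theta1

xs = [1.0, 2.0, 3.0]
ys = [3.0, 5.0, 7.0]              # generated from y = 1 + 2x
t0, t1 = gradient_descent(xs, ys)  # converges close to (1.0, 2.0)
```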

### Convergence

- If the **learning rate** is **small**, then convergence **takes time**
- If the **learning rate** is **high**, the values **overshoot** the minimum

Choosing the right learning rate is very important.
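Both behaviours can be seen on a toy one-parameter cost J(t) = t², whose gradient is 2t. The specific alpha values below are illustrative assumptions:

```python
def descend(alpha, steps, start=10.0):
    """Run gradient descent on J(t) = t ** 2, whose gradient is 2 * t."""
    t = start
    for _ in range(steps):
        t -= alpha * 2 * t
    return t

small = descend(alpha=0.01, steps=50)  # still far from 0: too slow
good  = descend(alpha=0.1,  steps=50)  # very close to the minimum at 0
big   = descend(alpha=1.5,  steps=50)  # each step overshoots and diverges
```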

### Multiple Features

- For illustration, **a single variable** has been used so far. In practice, **multiple features / independent variables** are used to predict a variable.
- In the first example, you saw how housing prices were predicted from their square-feet value. Real problems can be more **complex** and require **multiple features** to map the output.

### Hypothesis Representation

With two features, the hypothesis becomes h(x) = θ0 + θ1·x1 + θ2·x2, where:

- θ0 could be the base price of a house
- θ1 could be the price per square foot
- θ2 could be the price per level
- x1 could be the area of the house in square feet
- x2 could be the number of floors
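A minimal sketch of a multi-feature hypothesis follows. The parameter values (base price 50,000; 100 per square foot; 10,000 per floor) and the example house are illustrative assumptions:

```python
def predict_price(thetas, features):
    """h(x) = theta0 + theta1 * x1 + theta2 * x2 + ... (theta0 is the base price)."""
    result = thetas[0]
    for theta, x in zip(thetas[1:], features):
        result += theta * x
    return result

# A 1,200 sq ft house with 2 floors, using the illustrative parameters above.
price = predict_price([50_000, 100, 10_000], [1_200, 2])
# 50_000 + 100 * 1_200 + 10_000 * 2 = 190_000
```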

### Why Feature Scaling?

When there are **multiple features** and the feature values differ greatly in **magnitude**, combining them into a model and training it becomes inefficient: gradient descent converges much faster when the features are on a similar scale.

**Scaling** comes to our help in these scenarios.

##### What is Feature Scaling?

- In scaling, each feature/variable is divided by its **range (maximum minus minimum)**
- The result of scaling is a variable whose values span a range of 1
- This eases the computational burden to a considerable extent
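A minimal sketch of this kind of scaling, using made-up house areas:

```python
def scale(values):
    """Divide each value by the feature's range (maximum minus minimum)."""
    spread = max(values) - min(values)
    return [v / spread for v in values]

areas = [800.0, 1_200.0, 1_600.0]   # illustrative areas in sq ft
scaled = scale(areas)               # [1.0, 1.5, 2.0]: values now span a range of 1
```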

##### Mean Normalization

- In **mean normalization**, the **mean** of each variable is **subtracted** from the variable
- In many cases, **mean normalization** and **scaling** are performed **together**
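Combining both steps gives a feature that is centred on zero and spans a range of 1. The house areas below are illustrative:

```python
def mean_normalize(values):
    """Subtract the mean, then divide by the range (normalization + scaling)."""
    mean = sum(values) / len(values)
    spread = max(values) - min(values)
    return [(v - mean) / spread for v in values]

areas = [800.0, 1_200.0, 1_600.0]       # illustrative areas in sq ft
normalized = mean_normalize(areas)      # [-0.5, 0.0, 0.5]: centred on 0, range of 1
```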