Basic Image Processing in Python

We will briefly talk about how to do basic image processing in Python using the scikit-image package. You can click here for more information about scikit-image: https://scikit-image.org/docs/stable/user_guide.html.

scikit-image is an image processing toolbox for SciPy. You can also click here for another tutorial on image processing: http://scipy-lectures.org/advanced/image_processing/index.html

Some Basics on Image Storage

Check here for a brief intro to how images are stored as pixels and pixel values: https://www.whydomath.org/node/wavlets/imagebasics.html

Get Started

Let us install the package:

scikit-image comes pre-installed with several Python distributions, including Anaconda.

On all major operating systems, install it via shell/command prompt:

pip install scikit-image

If this does not work, try: pip install -U scikit-image

If you are running Anaconda or miniconda, use:

conda install -c conda-forge scikit-image

Note that scikit-image is a Python image processing package that works with NumPy arrays. The package is imported as skimage:

In [40]:
import skimage

Most functions of skimage are found within submodules. You can click here for more info on those modules: https://scikit-image.org/docs/stable/api/api.html

Let us import the data set from skimage for an example. The skimage.data submodule provides a set of functions returning example images that can be used to get started quickly with scikit-image's functions.

Note that within scikit-image, images are represented as NumPy arrays, for example 2-D arrays for grayscale 2-D images

In [41]:
from skimage import data
camera = data.camera()
In [42]:
type(camera)
Out[42]:
numpy.ndarray
In [43]:
# An image with 512 rows and 512 columns
camera.shape
Out[43]:
(512, 512)

Of course, it is also possible to load your own images as NumPy arrays from image files, using skimage.io.imread():

In [44]:
import os
cwd=os.getcwd()
filename = os.path.join(cwd, 'benson.JPG')
from skimage import io
benson = io.imread(filename)
type(benson)
Out[44]:
numpy.ndarray

Let us plot this image

In [45]:
import matplotlib.pyplot as plt
import matplotlib
%matplotlib inline
matplotlib.rcParams['font.size'] = 16

plt.figure()
plt.title("Zhang Family: Benson, Max, and Snow")
plt.imshow(benson)
plt.show()

You can use natsort to load multiple images in natural sort order: https://pypi.org/project/natsort/

Install this package with pip install natsort.

The documentation is here: https://natsort.readthedocs.io/en/master/

In [46]:
import os
from natsort import natsorted, ns
from skimage import io
import glob
list_files = glob.glob('*JPG')
list_files
list_files = natsorted(list_files)
list_files
image_list = []
for filename in list_files:
      image_list.append(io.imread(filename))

Let us plot them all...

In [47]:
for image in image_list:
    plt.figure()
    plt.imshow(image)

plt.show()

A crash course on NumPy for images

Images in scikit-image are represented by NumPy ndarrays. Hence, many common operations can be achieved using standard NumPy methods for manipulating arrays. Let us first get some basic ideas about NumPy.

You can click here for more details about NumPy: https://numpy.org/devdocs/user/quickstart.html

or https://cs231n.github.io/python-numpy-tutorial/

Disclosure: this part of the tutorial relies heavily on the NumPy quickstart guide at https://numpy.org/devdocs/user/quickstart.html. You can click there for your reference.

The Basics

NumPy’s main object is the homogeneous multidimensional array. It is a table of elements (usually numbers), all of the same type, indexed by a tuple of non-negative integers. In NumPy dimensions are called axes.

For example, the array of coordinates of a point in 3D space, [1, 2, 1], has one axis. That axis has 3 elements in it, so we say it has a length of 3. In the example pictured below, the array has 2 axes. The first axis has a length of 2, the second axis has a length of 3.

[[ 1., 0., 0.],
 [ 0., 1., 2.]]

NumPy’s array class is called ndarray. It is also known by the alias array. Note that numpy.array is not the same as the Standard Python Library class array.array, which only handles one-dimensional arrays and offers less functionality. The more important attributes of an ndarray object are:

ndarray.ndim the number of axes (dimensions) of the array.

ndarray.shape the dimensions of the array. This is a tuple of integers indicating the size of the array in each dimension. For a matrix with n rows and m columns, shape will be (n,m). The length of the shape tuple is therefore the number of axes, ndim.

ndarray.size the total number of elements of the array. This is equal to the product of the elements of shape.

ndarray.dtype an object describing the type of the elements in the array. One can create or specify dtype’s using standard Python types. Additionally NumPy provides types of its own. numpy.int32, numpy.int16, and numpy.float64 are some examples.
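
As a quick illustration, here is a minimal sketch that builds the two-axis array pictured above and inspects these attributes (the benson image is checked the same way in the next cells):

import numpy as np

a = np.array([[1., 0., 0.],
              [0., 1., 2.]])
print(a.ndim)    # 2 axes
print(a.shape)   # (2, 3)
print(a.size)    # 6 elements in total
print(a.dtype)   # float64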

In [48]:
# let us see benson's dimension
benson.ndim
Out[48]:
3
In [49]:
benson.shape
Out[49]:
(1024, 809, 3)
In [50]:
benson.size
Out[50]:
2485248
In [51]:
benson.dtype
Out[51]:
dtype('uint8')

Array Creation

There are several ways to create arrays.

For example, you can create an array from a regular Python list or tuple using the array function. The type of the resulting array is deduced from the type of the elements in the sequences.

In [52]:
import numpy as np
a = np.array([2,3,4])
a
Out[52]:
array([2, 3, 4])
In [53]:
a.dtype
Out[53]:
dtype('int64')
In [54]:
b = np.array([1.2, 3.5, 5.1])
b.dtype
Out[54]:
dtype('float64')
In [55]:
#A frequent error consists in calling array with multiple arguments, rather than providing a single sequence as an argument.
a = np.array(1,2,3,4)    # WRONG
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-55-c929d0a5f287> in <module>
      1 #A frequent error consists in calling array with multiple arguments, rather than providing a single sequence as an argument.
----> 2 a = np.array(1,2,3,4)    # WRONG

TypeError: array() takes from 1 to 2 positional arguments but 4 were given
In [56]:
a = np.array([1,2,3,4])  # RIGHT
# array transforms sequences of sequences into two-dimensional arrays, sequences of sequences of sequences into three-dimensional arrays, and so on.
In [57]:
#The function zeros creates an array full of zeros, the function ones creates an array full of ones, and the function empty creates an array whose initial content is random and depends on the state of the memory. By default, the dtype of the created array is float64.
np.zeros((3, 4))                                # an array of zeros
Out[57]:
array([[0., 0., 0., 0.],
       [0., 0., 0., 0.],
       [0., 0., 0., 0.]])
In [58]:
np.ones( (2,3,4), dtype=np.int16 )                # dtype can also be specified
Out[58]:
array([[[1, 1, 1, 1],
        [1, 1, 1, 1],
        [1, 1, 1, 1]],

       [[1, 1, 1, 1],
        [1, 1, 1, 1],
        [1, 1, 1, 1]]], dtype=int16)
In [59]:
np.empty( (2,3) )                               # uninitialized; contents depend on the state of memory
Out[59]:
array([[   0.,  200.,  400.],
       [ 600.,  800., 1000.]])
In [60]:
#To create sequences of numbers, NumPy provides the arange function which is analogous to the Python built-in range, but returns an array.
np.arange( 10, 30, 5 )
Out[60]:
array([10, 15, 20, 25])
In [61]:
from numpy import pi
np.linspace( 0, 2, 9 )                 # 9 numbers from 0 to 2
Out[61]:
array([0.  , 0.25, 0.5 , 0.75, 1.  , 1.25, 1.5 , 1.75, 2.  ])

Basic Operations

Arithmetic operators on arrays apply elementwise. A new array is created and filled with the result.

In [62]:
a = np.array( [20,30,40,50] )
a
Out[62]:
array([20, 30, 40, 50])
In [63]:
b = np.arange( 4 )
b
Out[63]:
array([0, 1, 2, 3])
In [64]:
c = a-b
c
Out[64]:
array([20, 29, 38, 47])
In [65]:
b**2
Out[65]:
array([0, 1, 4, 9])
In [66]:
a<35
Out[66]:
array([ True,  True, False, False])

Unlike in many matrix languages, the product operator * operates elementwise in NumPy arrays. The matrix product can be performed using the @ operator (in Python >= 3.5) or the dot function or method:

In [67]:
A = np.array( [[1,1],
               [0,1]] )
B = np.array( [[2,0],
               [3,4]] )
A * B                       # elementwise product
Out[67]:
array([[2, 0],
       [0, 4]])
In [68]:
A @ B                       # matrix product
Out[68]:
array([[5, 4],
       [3, 4]])
In [69]:
A.dot(B)                    # another matrix product
Out[69]:
array([[5, 4],
       [3, 4]])

You can also reshape a NumPy array like this:

In [70]:
b = np.arange(12)
b
Out[70]:
array([ 0,  1,  2,  3,  4,  5,  6,  7,  8,  9, 10, 11])
In [71]:
b.reshape(3,4)
Out[71]:
array([[ 0,  1,  2,  3],
       [ 4,  5,  6,  7],
       [ 8,  9, 10, 11]])

NumPy with skimage

In [72]:
from skimage import data
camera = data.camera()
type(camera)
Out[72]:
numpy.ndarray
In [73]:
camera.shape
Out[73]:
(512, 512)
In [74]:
camera.min(), camera.max()
Out[74]:
(0, 255)
In [75]:
camera.mean()
Out[75]:
118.31400299072266

NumPy arrays representing images can be of different integer or float numerical types. See "Image data types and what they mean" for more information about these types and how scikit-image treats them: https://scikit-image.org/docs/stable/user_guide/data_types.html#data-types

Image data types and what they mean: in skimage, images are simply NumPy arrays, which support a variety of data types, i.e. "dtypes". To avoid distorting image intensities (see Rescaling intensity values), we assume that images use the following dtype ranges:

Data type and range:

uint8: 0 to 255

uint16: 0 to 65535

uint32: 0 to $2^{32}$ - 1

float: -1 to 1 or 0 to 1

int8: -128 to 127

int16: -32768 to 32767

int32: -$2^{31}$ to $2^{31}$ - 1

Note that float images should be restricted to the range -1 to 1 even though the data type itself can exceed this range; all integer dtypes, on the other hand, have pixel intensities that can span the entire data type range. With a few exceptions, 64-bit (u)int images are not supported.

Functions in skimage are designed so that they accept any of these dtypes, but, for efficiency, may return an image of a different dtype (see Output types). If you need a particular dtype, skimage provides utility functions that convert dtypes and properly rescale image intensities (see Input types). You should never use astype on an image, because it violates these assumptions about the dtype range.

The following utility functions in the main package are available to developers and users:

Function name and description:

img_as_float: Convert to 64-bit floating point.

img_as_ubyte: Convert to 8-bit uint.

img_as_uint: Convert to 16-bit uint.

img_as_int: Convert to 16-bit int.

These functions convert images to the desired dtype and properly rescale their values.
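
For example, here is a minimal sketch (using the camera image from skimage.data shown earlier) of how img_as_float and img_as_ubyte rescale pixel values:

from skimage import data, img_as_float, img_as_ubyte

camera = data.camera()                      # uint8, values in 0..255
camera_float = img_as_float(camera)         # float64, values rescaled to 0.0..1.0
print(camera.dtype, camera.min(), camera.max())
print(camera_float.dtype, camera_float.min(), camera_float.max())

camera_back = img_as_ubyte(camera_float)    # back to uint8, rescaled to 0..255
print(camera_back.dtype, camera_back.min(), camera_back.max())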

NumPy indexing

NumPy indexing can be used both for looking at the pixel values and to modify them:

In [76]:
# Get the value of the pixel at row 10, column 20 (0-indexed)
benson[10, 20]
Out[76]:
array([202, 156, 122], dtype=uint8)
In [77]:
# Set the pixel at row 3, column 10 to black
benson[3, 10] = 0

Be careful! In NumPy indexing, the first dimension (camera.shape[0]) corresponds to rows, while the second (camera.shape[1]) corresponds to columns, with the origin (camera[0, 0]) at the top-left corner.
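
As a small sketch of what this row/column convention means in practice, plain NumPy slicing and masking can be used to crop or edit regions of the camera image:

from skimage import data

camera = data.camera()
top_left = camera[:100, :200]   # first 100 rows, first 200 columns
print(top_left.shape)           # (100, 200)

# Boolean masking: set every pixel darker than 80 to white
bright = camera.copy()
bright[bright < 80] = 255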

Rescale, resize, and downscale

Rescale resizes an image by a given scaling factor. The scaling factor can either be a single floating point value or multiple values, one along each axis.

Resize serves the same purpose, but allows you to specify an output image shape instead of a scaling factor.

Downscale down-samples an n-dimensional image by integer factors, using the local mean over each block whose size is given by the factors passed to the function.

In [78]:
import matplotlib.pyplot as plt
matplotlib.rcParams['font.size'] = 10
from skimage import data, color
from skimage.transform import rescale, resize, downscale_local_mean

image = color.rgb2gray(benson)

image_rescaled = rescale(image, 0.25)
image_resized = resize(image, (image.shape[0] // 4, image.shape[1] // 4))

fig, axes = plt.subplots(nrows=1, ncols=3)

ax = axes.ravel()
# check here for ravel function-it returns a contiguous flattened array.
# https://numpy.org/doc/stable/reference/generated/numpy.ravel.html

ax[0].imshow(image, cmap='gray')
ax[0].set_title("Original image")

ax[1].imshow(image_rescaled, cmap='gray')
ax[1].set_title("Rescaled image")

ax[2].imshow(image_resized, cmap='gray')
ax[2].set_title("Resized image")

ax[0].set_xlim(0, 512)   # show only a 512x512 region of the original
ax[0].set_ylim(512, 0)
plt.tight_layout()
plt.show()
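
Note that downscale_local_mean was imported above but not used in the figure; a minimal sketch of it (using the same grayscale image) looks like this:

image_downscaled = downscale_local_mean(image, (4, 4))   # average over 4x4 blocks
print(image.shape, image_downscaled.shape)

plt.imshow(image_downscaled, cmap='gray')
plt.title("Downscaled image (local mean)")
plt.show()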

Some Basics of Neural Networks

In the lecture, we use logistic regression as an example to showcase how we use gradient descent to compute the weights and biases. Here, let us use NumPy to implement gradient descent...

Disclosure: this part of the code is adapted from https://ml-cheatsheet.readthedocs.io/en/latest/logistic_regression.html

In [79]:
import numpy as np # check here for numpy tutorial <https://numpy.org/devdocs/user/absolute_beginners.html>
import matplotlib.pyplot as plt
from sklearn.datasets import make_classification
import seaborn as sns

Let us create a fake dataset for our classification task.

In [80]:
features, labels = make_classification(n_samples=1000, n_features=2, n_redundant=0, n_informative=1,
                             n_clusters_per_class=1, random_state=2020)
# Josh's comments: Generate a random n-class classification problem.
# This initially creates clusters of points normally distributed (std=1) about vertices of 
# an n_informative-dimensional hypercube with sides of length 2*class_sep and assigns an equal number 
# of clusters to each class. It introduces interdependence between these features and 
# adds various types of further noise to the data.
In [81]:
features
Out[81]:
array([[ 1.05051896, -1.91803393],
       [ 0.3301913 , -2.04293774],
       [ 1.79117866, -3.10849906],
       ...,
       [-0.3926355 , -2.07746983],
       [ 0.21370577,  1.40554183],
       [ 0.08462641, -1.83679819]])
In [82]:
features.ndim
Out[82]:
2
In [83]:
labels
Out[83]:
array([0, 0, 0, 0, 0, 1, 1, 0, 0, 1, 1, 0, 0, 1, 1, 0, 1, 1, 1, 1, 1, 0,
       1, 1, 1, 0, 0, 0, 0, 1, 0, 0, 1, 1, 1, 1, 1, 0, 0, 1, 0, 1, 1, 1,
       1, 1, 0, 0, 0, 0, 1, 0, 1, 0, 1, 1, 0, 1, 0, 1, 1, 1, 0, 1, 1, 0,
       1, 1, 0, 1, 0, 0, 0, 1, 0, 1, 1, 0, 1, 0, 1, 1, 0, 1, 1, 1, 1, 0,
       1, 1, 0, 1, 0, 1, 0, 1, 0, 0, 1, 1, 1, 1, 1, 0, 1, 1, 1, 1, 0, 0,
       0, 1, 1, 0, 1, 1, 1, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 1, 0, 0, 1,
       0, 0, 0, 0, 1, 1, 1, 1, 0, 1, 0, 1, 0, 0, 1, 1, 1, 1, 0, 1, 0, 0,
       0, 1, 0, 1, 0, 0, 1, 1, 1, 0, 1, 1, 0, 1, 0, 1, 0, 0, 0, 0, 1, 1,
       1, 0, 1, 1, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 1, 0, 0,
       0, 1, 0, 0, 0, 1, 0, 1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0,
       1, 0, 1, 1, 1, 1, 0, 0, 1, 1, 0, 0, 1, 1, 1, 1, 1, 0, 0, 1, 1, 1,
       1, 0, 0, 0, 1, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0,
       0, 1, 1, 1, 1, 1, 0, 0, 1, 0, 1, 1, 1, 1, 1, 1, 1, 0, 1, 1, 1, 0,
       0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0,
       0, 1, 1, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0,
       1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 1, 0, 1, 0, 0, 0, 1, 1,
       1, 1, 1, 0, 1, 1, 0, 1, 1, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0,
       0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 1, 0, 0, 1,
       1, 1, 1, 1, 1, 0, 1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 1, 0, 0,
       1, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 1, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1,
       1, 0, 1, 0, 0, 0, 1, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 0, 0, 1, 0, 1,
       1, 0, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1,
       1, 1, 0, 1, 0, 1, 0, 1, 1, 1, 1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0,
       1, 0, 0, 1, 1, 1, 1, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 1, 0,
       0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 1, 0, 1, 0, 0, 0, 0, 1, 0, 1, 1,
       1, 1, 1, 0, 0, 0, 1, 1, 1, 0, 0, 1, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0,
       1, 1, 0, 1, 1, 0, 1, 0, 1, 1, 1, 1, 1, 0, 0, 0, 1, 1, 1, 0, 0, 1,
       1, 0, 1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 1, 0, 1, 0, 0, 1, 1, 1, 0, 0,
       1, 1, 1, 1, 1, 0, 1, 0, 0, 0, 0, 1, 0, 1, 1, 1, 1, 0, 1, 1, 0, 0,
       1, 0, 0, 1, 1, 1, 0, 1, 1, 0, 0, 1, 1, 1, 1, 0, 0, 0, 1, 1, 0, 1,
       0, 0, 1, 0, 1, 1, 1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 1, 1, 1, 1,
       1, 1, 1, 1, 0, 1, 0, 1, 1, 1, 1, 1, 1, 1, 0, 1, 0, 1, 0, 0, 0, 0,
       0, 0, 1, 0, 1, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0,
       1, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1,
       0, 1, 1, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 1, 0, 1, 0, 0, 0, 1,
       0, 0, 1, 0, 0, 0, 0, 1, 1, 0, 1, 1, 1, 1, 1, 1, 0, 0, 1, 0, 1, 0,
       1, 0, 1, 0, 0, 1, 1, 1, 0, 0, 0, 1, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0,
       0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 1, 0,
       1, 1, 0, 1, 1, 1, 0, 1, 1, 0, 0, 0, 1, 1, 1, 0, 1, 1, 1, 1, 0, 1,
       0, 0, 1, 1, 0, 1, 1, 1, 1, 0, 0, 1, 0, 1, 0, 1, 0, 1, 1, 1, 1, 1,
       0, 1, 0, 1, 1, 1, 0, 1, 1, 0, 0, 0, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0,
       0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 1, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 1,
       1, 0, 0, 1, 1, 1, 1, 0, 1, 0, 0, 1, 1, 1, 1, 1, 1, 0, 1, 1, 1, 0,
       1, 0, 1, 1, 0, 1, 0, 1, 1, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0,
       0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 1, 0, 1, 1, 1,
       0, 1, 1, 1, 1, 0, 0, 0, 1, 0])
In [84]:
labels.ndim
Out[84]:
1
In [85]:
type(labels)
Out[85]:
numpy.ndarray
In [86]:
features.shape,labels.shape
Out[86]:
((1000, 2), (1000,))
In [87]:
labels = labels[:,np.newaxis]
# Josh's comments: numpy.newaxis is used to increase the dimension of the existing array by one more dimension.
# When you read data, you should notice that labels is a rank 1 array. A rank 1 array has a shape of (m,),
# whereas a rank 2 array has a shape of (m, 1). When operating on arrays it is good to
# convert rank 1 arrays to rank 2 arrays because rank 1 arrays often give unexpected results.
# To convert a rank 1 array to a rank 2 array we use someArray[:, np.newaxis]
In [88]:
labels
Out[88]:
array([[0],
       [0],
       [0],
       [0],
       [0],
       [1],
       [1],
       [0],
       [0],
       [1],
       [1],
       [0],
       [0],
       [1],
       [1],
       [0],
       [1],
       [1],
       [1],
       [1],
       [1],
       [0],
       [1],
       [1],
       [1],
       [0],
       [0],
       [0],
       [0],
       [1],
       [0],
       [0],
       [1],
       [1],
       [1],
       [1],
       [1],
       [0],
       [0],
       [1],
       [0],
       [1],
       [1],
       [1],
       [1],
       [1],
       [0],
       [0],
       [0],
       [0],
       [1],
       [0],
       [1],
       [0],
       [1],
       [1],
       [0],
       [1],
       [0],
       [1],
       [1],
       [1],
       [0],
       [1],
       [1],
       [0],
       [1],
       [1],
       [0],
       [1],
       [0],
       [0],
       [0],
       [1],
       [0],
       [1],
       [1],
       [0],
       [1],
       [0],
       [1],
       [1],
       [0],
       [1],
       [1],
       [1],
       [1],
       [0],
       [1],
       [1],
       [0],
       [1],
       [0],
       [1],
       [0],
       [1],
       [0],
       [0],
       [1],
       [1],
       [1],
       [1],
       [1],
       [0],
       [1],
       [1],
       [1],
       [1],
       [0],
       [0],
       [0],
       [1],
       [1],
       [0],
       [1],
       [1],
       [1],
       [1],
       [0],
       [0],
       [0],
       [1],
       [0],
       [1],
       [0],
       [0],
       [1],
       [1],
       [1],
       [0],
       [0],
       [1],
       [0],
       [0],
       [0],
       [0],
       [1],
       [1],
       [1],
       [1],
       [0],
       [1],
       [0],
       [1],
       [0],
       [0],
       [1],
       [1],
       [1],
       [1],
       [0],
       [1],
       [0],
       [0],
       [0],
       [1],
       [0],
       [1],
       [0],
       [0],
       [1],
       [1],
       [1],
       [0],
       [1],
       [1],
       [0],
       [1],
       [0],
       [1],
       [0],
       [0],
       [0],
       [0],
       [1],
       [1],
       [1],
       [0],
       [1],
       [1],
       [0],
       [1],
       [0],
       [0],
       [1],
       [0],
       [0],
       [0],
       [1],
       [0],
       [1],
       [0],
       [0],
       [1],
       [1],
       [1],
       [0],
       [0],
       [0],
       [1],
       [0],
       [0],
       [0],
       [1],
       [0],
       [1],
       [0],
       [1],
       [1],
       [0],
       [1],
       [0],
       [0],
       [1],
       [1],
       [0],
       [1],
       [1],
       [0],
       [0],
       [1],
       [0],
       [1],
       [1],
       [1],
       [1],
       [0],
       [0],
       [1],
       [1],
       [0],
       [0],
       [1],
       [1],
       [1],
       [1],
       [1],
       [0],
       [0],
       [1],
       [1],
       [1],
       [1],
       [0],
       [0],
       [0],
       [1],
       [0],
       [1],
       [1],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [1],
       [0],
       [0],
       [1],
       [0],
       [0],
       [1],
       [1],
       [1],
       [1],
       [1],
       [0],
       [0],
       [1],
       [0],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [0],
       [1],
       [1],
       [1],
       [0],
       [0],
       [0],
       [1],
       [0],
       [0],
       [0],
       [1],
       [1],
       [0],
       [0],
       [1],
       [1],
       [0],
       [1],
       [1],
       [0],
       [1],
       [0],
       [0],
       [1],
       [1],
       [0],
       [0],
       [1],
       [1],
       [0],
       [1],
       [1],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [1],
       [1],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [1],
       [0],
       [1],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [1],
       [0],
       [1],
       [0],
       [1],
       [0],
       [1],
       [0],
       [0],
       [0],
       [1],
       [1],
       [1],
       [1],
       [1],
       [0],
       [1],
       [1],
       [0],
       [1],
       [1],
       [1],
       [1],
       [0],
       [0],
       [1],
       [0],
       [0],
       [1],
       [0],
       [0],
       [1],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [1],
       [0],
       [1],
       [0],
       [0],
       [1],
       [1],
       [0],
       [0],
       [0],
       [0],
       [1],
       [0],
       [0],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [0],
       [1],
       [0],
       [1],
       [1],
       [0],
       [1],
       [0],
       [0],
       [1],
       [1],
       [0],
       [1],
       [1],
       [1],
       [0],
       [0],
       [1],
       [1],
       [1],
       [0],
       [0],
       [1],
       [0],
       [1],
       [1],
       [1],
       [0],
       [1],
       [0],
       [1],
       [0],
       [0],
       [0],
       [0],
       [1],
       [0],
       [0],
       [1],
       [1],
       [0],
       [1],
       [0],
       [0],
       [0],
       [1],
       [1],
       [0],
       [1],
       [0],
       [0],
       [0],
       [1],
       [1],
       [0],
       [0],
       [0],
       [0],
       [1],
       [0],
       [1],
       [1],
       [0],
       [1],
       [0],
       [0],
       [1],
       [0],
       [1],
       [1],
       [1],
       [0],
       [0],
       [1],
       [0],
       [1],
       [1],
       [0],
       [0],
       [1],
       [0],
       [0],
       [1],
       [1],
       [1],
       [0],
       [1],
       [0],
       [1],
       [0],
       [1],
       [1],
       [1],
       [1],
       [0],
       [1],
       [1],
       [0],
       [0],
       [1],
       [0],
       [1],
       [1],
       [1],
       [0],
       [1],
       [0],
       [0],
       [1],
       [1],
       [1],
       [1],
       [1],
       [0],
       [0],
       [1],
       [1],
       [0],
       [0],
       [1],
       [0],
       [0],
       [0],
       [0],
       [1],
       [1],
       [0],
       [0],
       [0],
       [1],
       [1],
       [0],
       [0],
       [1],
       [0],
       [0],
       [1],
       [1],
       [1],
       [0],
       [1],
       [0],
       [0],
       [0],
       [0],
       [1],
       [0],
       [1],
       [1],
       [1],
       [1],
       [1],
       [0],
       [0],
       [0],
       [1],
       [1],
       [1],
       [0],
       [0],
       [1],
       [1],
       [0],
       [0],
       [1],
       [1],
       [0],
       [1],
       [1],
       [0],
       [0],
       [1],
       [1],
       [0],
       [1],
       [1],
       [0],
       [1],
       [0],
       [1],
       [1],
       [1],
       [1],
       [1],
       [0],
       [0],
       [0],
       [1],
       [1],
       [1],
       [0],
       [0],
       [1],
       [1],
       [0],
       [1],
       [0],
       [0],
       [1],
       [0],
       [1],
       [0],
       [0],
       [1],
       [0],
       [1],
       [0],
       [1],
       [0],
       [0],
       [1],
       [1],
       [1],
       [0],
       [0],
       [1],
       [1],
       [1],
       [1],
       [1],
       [0],
       [1],
       [0],
       [0],
       [0],
       [0],
       [1],
       [0],
       [1],
       [1],
       [1],
       [1],
       [0],
       [1],
       [1],
       [0],
       [0],
       [1],
       [0],
       [0],
       [1],
       [1],
       [1],
       [0],
       [1],
       [1],
       [0],
       [0],
       [1],
       [1],
       [1],
       [1],
       [0],
       [0],
       [0],
       [1],
       [1],
       [0],
       [1],
       [0],
       [0],
       [1],
       [0],
       [1],
       [1],
       [1],
       [0],
       [1],
       [1],
       [0],
       [0],
       [1],
       [0],
       [1],
       [1],
       [1],
       [0],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [0],
       [1],
       [0],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [0],
       [1],
       [0],
       [1],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [1],
       [0],
       [1],
       [1],
       [0],
       [0],
       [1],
       [1],
       [0],
       [1],
       [0],
       [0],
       [1],
       [1],
       [0],
       [1],
       [0],
       [0],
       [1],
       [0],
       [1],
       [0],
       [0],
       [0],
       [1],
       [0],
       [0],
       [1],
       [1],
       [0],
       [0],
       [1],
       [1],
       [0],
       [0],
       [0],
       [1],
       [0],
       [0],
       [0],
       [0],
       [1],
       [0],
       [1],
       [1],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [1],
       [0],
       [0],
       [0],
       [1],
       [0],
       [1],
       [0],
       [1],
       [0],
       [0],
       [0],
       [1],
       [0],
       [0],
       [1],
       [0],
       [0],
       [0],
       [0],
       [1],
       [1],
       [0],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [0],
       [0],
       [1],
       [0],
       [1],
       [0],
       [1],
       [0],
       [1],
       [0],
       [0],
       [1],
       [1],
       [1],
       [0],
       [0],
       [0],
       [1],
       [0],
       [1],
       [0],
       [1],
       [0],
       [0],
       [0],
       [1],
       [1],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [1],
       [0],
       [0],
       [1],
       [1],
       [0],
       [1],
       [1],
       [0],
       [1],
       [0],
       [1],
       [1],
       [0],
       [1],
       [1],
       [1],
       [0],
       [1],
       [1],
       [0],
       [0],
       [0],
       [1],
       [1],
       [1],
       [0],
       [1],
       [1],
       [1],
       [1],
       [0],
       [1],
       [0],
       [0],
       [1],
       [1],
       [0],
       [1],
       [1],
       [1],
       [1],
       [0],
       [0],
       [1],
       [0],
       [1],
       [0],
       [1],
       [0],
       [1],
       [1],
       [1],
       [1],
       [1],
       [0],
       [1],
       [0],
       [1],
       [1],
       [1],
       [0],
       [1],
       [1],
       [0],
       [0],
       [0],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [1],
       [0],
       [1],
       [0],
       [0],
       [1],
       [0],
       [1],
       [0],
       [0],
       [0],
       [0],
       [1],
       [1],
       [0],
       [0],
       [0],
       [0],
       [1],
       [1],
       [0],
       [0],
       [1],
       [1],
       [1],
       [1],
       [0],
       [1],
       [0],
       [0],
       [1],
       [1],
       [1],
       [1],
       [1],
       [1],
       [0],
       [1],
       [1],
       [1],
       [0],
       [1],
       [0],
       [1],
       [1],
       [0],
       [1],
       [0],
       [1],
       [1],
       [1],
       [0],
       [0],
       [0],
       [1],
       [1],
       [0],
       [0],
       [1],
       [0],
       [0],
       [1],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [0],
       [1],
       [0],
       [0],
       [1],
       [1],
       [0],
       [1],
       [0],
       [0],
       [0],
       [1],
       [0],
       [1],
       [1],
       [1],
       [0],
       [1],
       [1],
       [1],
       [1],
       [0],
       [0],
       [0],
       [1],
       [0]])
In [89]:
labels.ndim
Out[89]:
2
In [90]:
# Let us plot the data, feature 1 vs. feature 2, grouped by label
sns.set_style('white')
sns.scatterplot(x=features[:,0],y=features[:,1],hue=labels.reshape(-1))
# you can click here to see parameters for scatterplot function <https://seaborn.pydata.org/generated/seaborn.scatterplot.html>
Out[90]:
<AxesSubplot:>
In [91]:
type(features)
Out[91]:
numpy.ndarray
In [92]:
features.ndim, features.size
Out[92]:
(2, 2000)
In [93]:
labels.ndim, labels.size
Out[93]:
(2, 1000)
In [94]:
features.shape,labels.shape
Out[94]:
((1000, 2), (1000, 1))

Logistic regression

$$ h_{\theta}(x) = {\sigma}(\theta^{T}x)$$

As we mentioned in the lecture, the activation function in logistic regression is the sigmoid function:

$$ {\sigma}(z)=\frac{1}{1+e^{-z}} $$

The function maps any real value into another value between 0 and 1. In machine learning, we use sigmoid to map predictions to probabilities.

You can see its shape in the quick plot after we define the function below:

In [95]:
# let us define sigmoid function
def sigmoid(z):
    return 1.0 / (1 + np.exp(-z))
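
Here is a quick sketch of the sigmoid's shape, plotted with the function we just defined (the range of z values is chosen arbitrarily for illustration):

z = np.linspace(-10, 10, 200)
plt.plot(z, sigmoid(z))
plt.xlabel('z')
plt.ylabel('sigmoid(z)')
plt.title('Sigmoid maps any real value into (0, 1)')
plt.show()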

We need to write a prediction function. A prediction function in logistic regression returns the probability of our observation being positive, True, or "Yes". We call this class 1 and its notation is P(class=1). As the probability gets closer to 1, our model is more confident that the observation is in class 1.

Let us define $$z = \theta^{T}x$$ and ignore the bias term in this tutorial.

In [96]:
# let us define a prediction function
def predict(features, weights):
  '''
  Returns 1D array of probabilities
  that the class label == 1
  '''
  z = np.dot(features, weights)
  return sigmoid(z)

We need to define a cost function.

We use a cost function called cross-entropy, also known as log loss. Cross-entropy loss can be divided into two separate cost functions: one for y=1 and one for y=0.

Then let us combine these two components together:

$$ J(\theta) = \frac{1}{m}\sum_{i=1}^{m}\Big[-y^{(i)}\log\big(h_\theta(x^{(i)})\big)-\big(1-y^{(i)}\big)\log\big(1-h_\theta(x^{(i)})\big)\Big] $$

Josh's note: I strongly suggest you watch the lectures by Andrew Ng on Coursera regarding neural network basics and deep learning. Very intuitive.

Here is the vectorized version:

$$ J(\theta) = -\frac{1}{m}\Big(\log\big(g(X\theta)\big)^{T}y+\log\big(1-g(X\theta)\big)^{T}(1-y)\Big) $$

In [97]:
def cost_function(features, labels, weights):
    '''
    Cross-entropy (log loss) cost

    Features: (1000, 2)
    Labels: (1000, 1)
    Weights: (2, 1)
    Returns the average cost over all observations (a scalar)
    Cost = -( labels*log(predictions) + (1-labels)*log(1-predictions) ) / len(labels)
    '''
    observations = len(labels)

    predictions = predict(features, weights)

    #Take the error when label=1
    class1_cost = -labels*np.log(predictions)

    #Take the error when label=0
    class2_cost = (1-labels)*np.log(1-predictions)

    #Take the sum of both costs
    cost = class1_cost - class2_cost

    #Take the average cost
    cost = cost.sum() / observations

    return cost

# Josh's comments: you can write a vectorized version of the cost function:
#  J = -1*(1/m)*(np.log(predictions).T.dot(labels)+np.log(1-predictions).T.dot(1-labels))
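
As a quick sanity check: with all weights set to zero, every prediction is sigmoid(0) = 0.5, so the cross-entropy cost is -log(0.5) = ln 2, about 0.6931. The first value printed during training below (about 0.691) is slightly lower because it is computed after one gradient step. A minimal sketch of this check:

initial_weights = np.zeros((features.shape[1], 1))
print(cost_function(features, labels, initial_weights))   # about 0.6931 (= ln 2)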

To minimize our cost, we use Gradient Descent just like before in Linear Regression. There are other, more sophisticated optimization algorithms out there, such as conjugate gradient, BFGS, and L-BFGS, but you don't have to worry about these. Machine learning libraries like scikit-learn have their own implementations, so you can focus on more interesting things!

The basic idea of gradient descent is that, in order to minimize the cost function, you slightly nudge your weights, see how that changes your cost, and then update your weights accordingly...

$$ \text{weights} := \text{weights} - \text{learning rate} \times \text{slope} $$

Partial derivative

$$ \frac{\partial J(\theta)}{\partial\theta_{j}} = \frac{1}{m}\sum_{i=1}^{m} \big( h_\theta (x^{(i)})-y^{(i)}\big)x^{(i)}_{j} $$

Vectorized

$$ \nabla_{\theta} J(\theta) = \frac{1}{m} X^{T}\big(g(X\theta)-y\big) $$

Repeat {

  1. Calculate gradient average
  2. Multiply by learning rate
  3. Subtract from weights

}

In [98]:
def update_weights(features, labels, weights, lr):
    '''
    Vectorized Gradient Descent
    '''
    N = len(features)

    #1 - Get Predictions
    predictions = predict(features, weights)

    #2 Transpose features from (1000, 2) to (2, 1000)
    # so we can multiply with the (1000, 1) error matrix (predictions - labels).
    # Returns a (2, 1) matrix holding 2 partial derivatives --
    # one for each feature -- representing the aggregate
    # slope of the cost function across all observations
    gradient = np.dot(features.T,  predictions - labels)

    #3 Take the average cost derivative for each feature
    gradient /= N

    #4 - Multiply the gradient by our learning rate
    gradient *= lr

    #5 - Subtract from our weights to minimize cost
    weights -= gradient

    return weights

Time to train our model

In [99]:
def train(features, labels, weights, lr, iters):
    cost_history = []

    for i in range(iters):
        weights = update_weights(features, labels, weights, lr)

        #Calculate error for auditing purposes
        cost = cost_function(features, labels, weights)
        cost_history.append(cost)

        # Log Progress
        if i % 1000 == 0:
            print("iter: "+str(i) + " cost: "+str(cost))

    return weights, cost_history
In [100]:
lr = .01
iters = 10000
weights = np.zeros((features.shape[1], 1)) # one weight per feature; we initialize the weights as zeros -- not wise in general but it works fine for logistic regression
weights, cost_history = train(features,labels,weights,lr, iters)
iter: 0 cost: 0.6907503026719932
iter: 1000 cost: 0.28088507867909307
iter: 2000 cost: 0.25411236531525416
iter: 3000 cost: 0.2465118640785795
iter: 4000 cost: 0.24352972366738543
iter: 5000 cost: 0.24217536305489692
iter: 6000 cost: 0.24150773465410777
iter: 7000 cost: 0.2411613741647257
iter: 8000 cost: 0.24097548263143564
iter: 9000 cost: 0.24087334554935125
In [101]:
cost_history
Out[101]:
[0.6907503026719932,
 0.6883715331699282,
 0.6860107342960196,
 0.6836677686606087,
 0.6813424992628532,
 0.6790347895109066,
 0.6767445032415437,
 0.6744715047392255,
 0.6722156587546094,
 0.6699768305225076,
 0.6677548857792945,
 0.6655496907797687,
 0.6633611123134704,
 0.6611890177204629,
 0.6590332749065771,
 0.6568937523581287,
 0.6547703191561088,
 0.652662844989858,
 0.650571200170225,
 0.6484952556422198,
 0.646434882997163,
 0.6443899544843443,
 0.6423603430221884,
 0.6403459222089428,
 0.6383465663328908,
 0.636362150382096,
 0.6343925500536886,
 0.632437641762699,
 0.6304973026504475,
 0.6285714105924949,
 0.6266598442061684,
 0.6247624828576617,
 0.6228792066687259,
 0.6210098965229536,
 0.6191544340716669,
 0.6173127017394169,
 0.6154845827291028,
 0.6136699610267186,
 0.6118687214057368,
 0.6100807494311364,
 0.6083059314630828,
 0.6065441546602697,
 0.6047953069829304,
 0.6030592771955239,
 0.6013359548691117,
 0.5996252303834226,
 0.5979269949286224,
 0.5962411405067938,
 0.5945675599331313,
 0.5929061468368643,
 0.5912567956619122,
 0.5896194016672799,
 0.5879938609272037,
 0.5863800703310519,
 0.5847779275829899,
 0.583187331201415,
 0.5816081805181716,
 0.5800403756775492,
 0.5784838176350747,
 0.5769384081561036,
 0.5754040498142158,
 0.5738806459894269,
 0.5723681008662166,
 0.5708663194313842,
 0.5693752074717363,
 0.5678946715716123,
 0.5664246191102554,
 0.5649649582590333,
 0.563515597978516,
 0.5620764480154167,
 0.5606474188993977,
 0.5592284219397534,
 0.5578193692219696,
 0.5564201736041674,
 0.5550307487134367,
 0.5536510089420642,
 0.5522808694436594,
 0.5509202461291859,
 0.5495690556629006,
 0.5482272154582065,
 0.5468946436734227,
 0.5455712592074766,
 0.544256981695523,
 0.5429517315044923,
 0.5416554297285736,
 0.5403679981846368,
 0.5390893594075948,
 0.5378194366457137,
 0.5365581538558697,
 0.5353054356987613,
 0.5340612075340752,
 0.5328253954156131,
 0.5315979260863807,
 0.5303787269736413,
 0.529167726183939,
 0.5279648524980934,
 0.5267700353661674,
 0.5255832049024118,
 0.5244042918801924,
 0.5232332277268945,
 0.5220699445188157,
 0.5209143749760433,
 0.5197664524573218,
 0.5186261109549115,
 0.5174932850894401,
 0.5163679101047514,
 0.5152499218627479,
 0.5141392568382374,
 0.5130358521137769,
 0.5119396453745216,
 0.5108505749030776,
 0.509768579574361,
 0.5086935988504652,
 0.5076255727755371,
 0.5065644419706639,
 0.505510147628773,
 0.5044626315095441,
 0.5034218359343368,
 0.5023877037811341,
 0.501360178479503,
 0.5003392040055729,
 0.49932472487703383,
 0.4983166861481551,
 0.4973150334048242,
 0.49631971275960923,
 0.4953306708468434,
 0.494347854817734,
 0.4933712123354957,
 0.49240069157050964,
 0.49143624119550827,
 0.4904778103807872,
 0.4895253487894451,
 0.48857880657265,
 0.4876381343649367,
 0.48670328327953066,
 0.4857742049037039,
 0.4848508512941595,
 0.483933174972447,
 0.48302112892040966,
 0.48211466657566143,
 0.4812137418270977,
 0.4803183090104359,
 0.47942832290379017,
 0.4785437387232779,
 0.4776645121186594,
 0.47679059916901007,
 0.4759219563784268,
 0.47505854067176756,
 0.4742003093904239,
 0.47334722028812853,
 0.47249923152679607,
 0.4716563016723979,
 0.47081838969087114,
 0.46998545494406235,
 0.46915745718570495,
 0.46833435655743144,
 0.4675161135848187,
 0.4667026891734702,
 0.4658940446051297,
 0.46509014153383144,
 0.4642909419820835,
 0.46349640833708583,
 0.4627065033469827,
 0.46192119011714866,
 0.4611404321065098,
 0.4603641931238975,
 0.4595924373244377,
 0.4588251292059724,
 0.45806223360551646,
 0.45730371569574674,
 0.45654954098152545,
 0.4557996752964566,
 0.4550540847994755,
 0.45431273597147176,
 0.45357559561194455,
 0.4528426308356914,
 0.45211380906952864,
 0.4513890980490451,
 0.45066846581538744,
 0.4499518807120776,
 0.44923931138186274,
 0.44853072676359557,
 0.44782609608914764,
 0.4471253888803524,
 0.44642857494598065,
 0.4457356243787458,
 0.4450465075523404,
 0.4443611951185027,
 0.4436796580041141,
 0.4430018674083252,
 0.4423277947997135,
 0.44165741191346874,
 0.440990690748609,
 0.4403276035652253,
 0.43966812288175494,
 0.43901222147228336,
 0.4383598723638755,
 0.43771104883393364,
 0.43706572440758434,
 0.4364238728550929,
 0.43578546818930497,
 0.43515048466311546,
 0.4345188967669647,
 0.43389067922636104,
 0.43326580699943024,
 0.43264425527449085,
 0.43202599946765563,
 0.43141101522045955,
 0.4307992783975118,
 0.4301907650841745,
 0.4295854515842662,
 0.4289833144177898,
 0.4283843303186854,
 0.4277884762326074,
 0.4271957293147261,
 0.42660606692755254,
 0.42601946663878776,
 0.4254359062191951,
 0.4248553636404954,
 0.4242778170732863,
 0.42370324488498284,
 0.42313162563778167,
 0.42256293808664713,
 0.4219971611773196,
 0.42143427404434525,
 0.4208742560091281,
 0.42031708657800254,
 0.41976274544032854,
 0.4192112124666061,
 0.4186624677066113,
 0.4181164913875532,
 0.41757326391225047,
 0.4170327658573282,
 0.41649497797143453,
 0.4159598811734778,
 0.41542745655088176,
 0.4148976853578608,
 0.41437054901371506,
 0.4138460291011422,
 0.41332410736457,
 0.4128047657085062,
 0.41228798619590706,
 0.4117737510465636,
 0.411262042635506,
 0.4107528434914252,
 0.4102461362951129,
 0.40974190387791753,
 0.4092401292202183,
 0.4087407954499157,
 0.4082438858409388,
 0.4077493838117688,
 0.40725727292397973,
 0.40676753688079387,
 0.40628015952565477,
 0.40579512484081476,
 0.4053124169459387,
 0.404832020096723,
 0.4043539186835301,
 0.4038780972300376,
 0.4034045403919027,
 0.4029332329554412,
 0.40246415983632156,
 0.4019973060782722,
 0.4015326568518045,
 0.40107019745294886,
 0.40060991330200524,
 0.400151789942307,
 0.399695813038999,
 0.3992419683778286,
 0.39879024186395023,
 0.398340619520743,
 0.3978930874886414,
 0.3974476320239787,
 0.39700423949784364,
 0.3965628963949486,
 0.3961235893125114,
 0.3956863049591484,
 0.3952510301537804,
 0.39481775182455,
 0.3943864570077513,
 0.39395713284677136,
 0.3935297665910423,
 0.3931043455950063,
 0.39268085731709057,
 0.3922592893186946,
 0.3918396292631878,
 0.3914218649149182,
 0.3910059841382321,
 0.3905919748965044,
 0.39017982525117906,
 0.389769523360821,
 0.38936105748017696,
 0.3889544159592477,
 0.3885495872423699,
 0.38814655986730756,
 0.38774532246435406,
 0.3873458637554434,
 0.386948172553271,
 0.38655223776042463,
 0.3861580483685242,
 0.3857655934573705,
 0.38537486219410433,
 0.38498584383237333,
 0.38459852771150893,
 0.3842129032557112,
 0.3838289599732434,
 0.383446687455634,
 0.3830660753768889,
 0.3826871134927104,
 0.3823097916397255,
 0.38193409973472287,
 0.3815600277738963,
 0.3811875658320987,
 0.3808167040621013,
 0.38044743269386344,
 0.38007974203380807,
 0.37971362246410606,
 0.37934906444196786,
 0.3789860584989426,
 0.3786245952402248,
 0.3782646653439683,
 0.3779062595606078,
 0.3775493687121867,
 0.3771939836916933,
 0.3768400954624032,
 0.3764876950572286,
 0.376136773578075,
 0.3757873221952043,
 0.37543933214660474,
 0.3750927947373673,
 0.3747477013390688,
 0.374404043389162,
 0.3740618123903705,
 0.373720999910092,
 0.37338159757980666,
 0.37304359709449136,
 0.3727069902120414,
 0.37237176875269634,
 0.3720379245984742,
 0.37170544969260905,
 0.3713743360389963,
 0.371044575701643,
 0.370716160804124,
 0.3703890835290436,
 0.3700633361175029,
 0.36973891086857263,
 0.36941580013877184,
 0.3690939963415508,
 0.36877349194678105,
 0.36845427948024884,
 0.368136351523155,
 0.36781970071161946,
 0.3675043197361913,
 0.3671902013413632,
 0.3668773383250912,
 0.36656572353831945,
 0.36625534988450986,
 0.3659462103191763,
 0.3656382978494237,
 0.3653316055334916,
 0.3650261264803033,
 0.36472185384901795,
 0.3644187808485891,
 0.3641169007373259,
 0.36381620682245996,
 0.3635166924597165,
 0.36321835105288924,
 0.36292117605342,
 0.36262516095998254,
 0.36233029931807054,
 0.36203658471958966,
 0.3617440108024538,
 0.3614525712501853,
 0.36116225979151956,
 0.3608730702000127,
 0.36058499629365426,
 0.36029803193448334,
 0.3600121710282079,
 0.35972740752382887,
 0.35944373541326763,
 0.3591611487309972,
 0.35887964155367647,
 0.3585992079997898,
 0.35831984222928753,
 0.3580415384432332,
 0.3577642908834512,
 0.35748809383218044,
 0.35721294161172956,
 0.35693882858413667,
 0.3566657491508321,
 0.3563936977523045,
 0.35612266886776983,
 0.3558526570148444,
 0.35558365674922043,
 0.3553156626643451,
 0.35504866939110236,
 0.3547826715974987,
 0.3545176639883513,
 0.3542536413049787,
 0.3539905983248961,
 0.35372852986151204,
 0.3534674307638288,
 0.3532072959161457,
 0.352948120237765,
 0.3526898986827007,
 0.3524326262393905,
 0.35217629793040983,
 0.3519209088121896,
 0.3516664539747358,
 0.3514129285413523,
 0.351160327668366,
 0.35090864654485515,
 0.3506578803923794,
 0.3504080244647133,
 0.35015907404758173,
 0.34991102445839833,
 0.3496638710460063,
 0.3494176091904212,
 0.34917223430257693,
 0.34892774182407343,
 0.34868412722692754,
 0.3484413860133254,
 0.34819951371537783,
 0.3479585058948778,
 0.34771835814305996,
 0.3474790660803629,
 0.3472406253561932,
 0.3470030316486922,
 0.34676628066450416,
 0.3465303681385477,
 0.3462952898337886,
 0.34606104154101447,
 0.34582761907861276,
 0.34559501829234945,
 0.34536323505515065,
 0.34513226526688595,
 0.34490210485415407,
 0.34467274977006973,
 0.3444441959940537,
 0.3442164395316238,
 0.3439894764141879,
 0.34376330269884003,
 0.34353791446815624,
 0.34331330782999464,
 0.3430894789172954,
 0.34286642388788413,
 0.3426441389242756,
 0.34242262023348047,
 0.3422018640468134,
 0.3419818666197024,
 0.34176262423150083,
 0.3415441331853003,
 0.3413263898077458,
 0.3411093904488526,
 0.3408931314818238,
 0.3406776093028714,
 0.3404628203310367,
 0.34024876100801477,
 0.3400354277979782,
 0.33982281718740404,
 0.339610925684902,
 0.33939974982104343,
 0.3391892861481929,
 0.33897953124034025,
 0.3387704816929354,
 0.3385621341227232,
 0.3383544851675812,
 0.3381475314863577,
 0.33794126975871164,
 0.33773569668495473,
 0.33753080898589344,
 0.3373266034026736,
 0.3371230766966263,
 0.33692022564911406,
 0.33671804706138014,
 0.3365165377543975,
 0.33631569456872024,
 0.3361155143643359,
 0.33591599402051886,
 0.33571713043568546,
 0.33551892052725046,
 0.3353213612314841,
 0.335124449503371,
 0.33492818231647037,
 0.334732556662777,
 0.3345375695525837,
 0.33434321801434497,
 0.33414949909454206,
 0.3339564098575485,
 0.33376394738549764,
 0.3335721087781512,
 0.3333808911527681,
 0.33319029164397557,
 0.33300030740364095,
 0.3328109356007442,
 0.3326221734212521,
 0.33243401806799333,
 0.33224646676053443,
 0.33205951673505707,
 0.33187316524423643,
 0.3316874095571203,
 0.33150224695900965,
 0.3313176747513398,
 0.3311336902515628,
 0.3309502907930313,
 0.3307674737248822,
 0.3305852364119224,
 0.33040357623451566,
 0.33022249058846875,
 0.3300419768849211,
 0.32986203255023305,
 0.3296826550258763,
 0.3295038417683253,
 0.329325590248949,
 0.3291478979539039,
 0.32897076238402817,
 0.3287941810547355,
 0.3286181514959122,
 0.32844267125181226,
 0.32826773788095565,
 0.32809334895602627,
 0.32791950206377124,
 0.32774619480490064,
 0.3275734247939884,
 0.3274011896593738,
 0.3272294870430642,
 0.32705831460063817,
 0.32688767000114943,
 0.32671755092703175,
 0.3265479550740048,
 0.3263788801509804,
 0.3262103238799698,
 0.32604228399599167,
 0.3258747582469808,
 0.32570774439369776,
 0.32554124020963887,
 0.3253752434809476,
 0.3252097520063256,
 0.3250447635969462,
 0.32488027607636627,
 0.3247162872804415,
 0.32455279505723983,
 0.3243897972669574,
 0.324227291781834,
 0.32406527648607064,
 0.32390374927574567,
 0.323742708058734,
 0.32358215075462476,
 0.32342207529464156,
 0.3232624796215616,
 0.32310336168963705,
 0.32294471946451603,
 0.32278655092316455,
 0.32262885405378894,
 0.3224716268557594,
 0.32231486733953363,
 0.32215857352658095,
 0.3220027434493077,
 0.321847375150983,
 0.3216924666856643,
 0.32153801611812516,
 0.3213840215237816,
 0.3212304809886209,
 0.3210773926091299,
 0.320924754492224,
 0.32077256475517707,
 0.3206208215255514,
 0.3204695229411288,
 0.32031866714984186,
 0.320168252309706,
 0.3200182765887516,
 0.319868738164957,
 0.3197196352261823,
 0.31957096597010337,
 0.31942272860414606,
 0.3192749213454215,
 0.3191275424206617,
 0.3189805900661557,
 0.3188340625276857,
 0.3186879580604648,
 0.3185422749290741,
 0.318397011407401,
 0.31825216577857757,
 0.3181077363349199,
 0.3179637213778669,
 0.31782011921792125,
 0.31767692817458915,
 0.3175341465763213,
 0.31739177276045455,
 0.3172498050731534,
 0.3171082418693525,
 0.3169670815126991,
 0.3168263223754967,
 0.3166859628386478,
 0.31654600129159904,
 0.31640643613228464,
 0.31626726576707165,
 0.31612848861070564,
 0.315990103086256,
 0.3158521076250619,
 0.31571450066667944,
 0.31557728065882834,
 0.31544044605733895,
 0.31530399532610054,
 0.3151679269370094,
 0.31503223936991687,
 0.31489693111257916,
 0.31476200066060595,
 0.31462744651741054,
 0.31449326719415965,
 0.31435946120972413,
 0.3142260270906298,
 0.31409296337100867,
 0.3139602685925503,
 0.313827941304454,
 0.3136959800633814,
 0.3135643834334085,
 0.3134331499859792,
 0.31330227829985835,
 0.31317176696108595,
 0.31304161456293084,
 0.3129118197058451,
 0.31278238099741906,
 0.3126532970523364,
 0.31252456649232935,
 0.3123961879461346,
 0.3122681600494494,
 0.312140481444888,
 0.3120131507819384,
 0.3118861667169193,
 0.31175952791293765,
 0.3116332330398462,
 0.31150728077420164,
 0.3113816697992228,
 0.31125639880474953,
 0.31113146648720114,
 0.3110068715495361,
 0.31088261270121154,
 0.31075868865814266,
 0.3106350981426633,
 0.31051183988348635,
 0.31038891261566404,
 0.31026631508054925,
 0.3101440460257569,
 0.31002210420512516,
 0.30990048837867745,
 0.3097791973125848,
 0.30965822977912777,
 0.3095375845566596,
 0.3094172604295685,
 0.3092972561882416,
 0.30917757062902795,
 0.30905820255420247,
 0.3089391507719295,
 0.30882041409622785,
 0.3087019913469345,
 0.30858388134966985,
 0.3084660829358027,
 0.30834859494241507,
 0.3082314162122684,
 0.3081145455937689,
 0.3079979819409336,
 0.30788172411335646,
 0.3077657709761754,
 0.3076501214000384,
 0.3075347742610708,
 0.30741972844084225,
 0.3073049828263347,
 0.3071905363099095,
 0.3070763877892754,
 0.3069625361674567,
 0.30684898035276204,
 0.30673571925875226,
 0.30662275180420956,
 0.30651007691310633,
 0.30639769351457496,
 0.3062856005428764,
 0.3061737969373705,
 0.3060622816424856,
 0.30595105360768865,
 0.3058401117874556,
 0.3057294551412416,
 0.3056190826334521,
 0.30550899323341374,
 0.30539918591534504,
 0.30528965965832816,
 0.30518041344628016,
 0.30507144626792493,
 0.3049627571167649,
 0.3048543449910533,
 0.30474620889376614,
 0.30463834783257526,
 0.30453076081982067,
 0.3044234468724831,
 0.30431640501215784,
 0.3042096342650273,
 0.3041031336618346,
 0.30399690223785747,
 0.3038909390328818,
 0.3037852430911754,
 0.3036798134614628,
 0.3035746491968988,
 0.3034697493550434,
 0.3033651129978366,
 0.3032607391915728,
 0.30315662700687623,
 0.30305277551867593,
 0.3029491838061814,
 0.3028458509528573,
 0.30274277604640043,
 0.3026399581787143,
 0.3025373964458863,
 0.3024350899481627,
 0.30233303778992615,
 0.30223123907967137,
 0.3021296929299822,
 0.3020283984575082,
 0.30192735478294175,
 0.3018265610309953,
 0.3017260163303785,
 0.3016257198137755,
 0.30152567061782265,
 0.3014258678830867,
 0.3013263107540417,
 0.301226998379048,
 0.30112792991032994,
 0.30102910450395415,
 0.3009305213198083,
 0.3008321795215794,
 0.30073407827673293,
 0.3006362167564912,
 0.300538594135813,
 0.3004412095933723,
 0.30034406231153776,
 0.3002471514763524,
 0.30015047627751257,
 0.30005403590834834,
 0.29995782956580297,
 0.299861856450413,
 0.2997661157662885,
 0.2996706067210928,
 0.29957532852602387,
 0.29948028039579355,
 0.2993854615486094,
 0.2992908712061546,
 0.29919650859356933,
 0.2991023729394315,
 0.299008463475738,
 0.2989147794378861,
 0.29882132006465445,
 0.2987280845981851,
 0.29863507228396463,
 0.29854228237080627,
 0.29844971411083154,
 0.2983573667594523,
 0.29826523957535284,
 0.29817333182047207,
 0.29808164275998605,
 0.29799017166228986,
 0.2978989177989811,
 0.29780788044484147,
 0.29771705887782046,
 0.29762645237901764,
 0.29753606023266604,
 0.297445881726115,
 0.29735591614981355,
 0.2972661627972935,
 0.29717662096515346,
 0.29708728995304134,
 0.2969981690636392,
 0.29690925760264597,
 0.2968205548787621,
 0.2967320602036728,
 0.29664377289203264,
 0.29655569226144946,
 0.2964678176324687,
 0.29638014832855775,
 0.2962926836760902,
 0.29620542300433084,
 0.29611836564541977,
 0.29603151093435776,
 0.29594485820899036,
 0.2958584068099936,
 0.2957721560808584,
 0.2956861053678761,
 0.2956002540201235,
 0.29551460138944824,
 0.295429146830454,
 0.29534388970048664,
 0.2952588293596188,
 0.2951739651706364,
 0.295089296499024,
 0.2950048227129509,
 0.2949205431832569,
 0.2948364572834383,
 0.29475256438963426,
 0.29466886388061275,
 0.294585355137757,
 0.29450203754505183,
 0.29441891048907004,
 0.2943359733589591,
 0.29425322554642763,
 0.2941706664457323,
 0.2940882954536644,
 0.29400611196953674,
 0.29392411539517094,
 0.293842305134884,
 0.2937606805954757,
 0.29367924118621563,
 0.29359798631883066,
 0.29351691540749214,
 0.2934360278688034,
 0.29335532312178714,
 0.2932748005878732,
 0.293194459690886,
 0.2931142998570326,
 0.2930343205148899,
 0.2929545210953935,
 0.2928749010318245,
 0.2927954597597985,
 0.29271619671725296,
 0.2926371113444361,
 0.29255820308389446,
 0.2924794713804617,
 0.29240091568124665,
 0.29232253543562214,
 0.29224433009521306,
 0.29216629911388536,
 0.29208844194773464,
 0.2920107580550745,
 0.291933246896426,
 0.29185590793450594,
 0.29177874063421616,
 0.2917017444626323,
 0.2916249188889931,
 0.2915482633846894,
 0.2914717774232533,
 0.2913954604803473,
 0.2913193120337543,
 0.2912433315633658,
 0.29116751855117273,
 0.29109187248125373,
 0.29101639283976566,
 0.29094107911493267,
 0.2908659307970361,
 0.2907909473784042,
 0.29071612835340227,
 0.29064147321842176,
 0.29056698147187116,
 0.29049265261416535,
 0.29041848614771587,
 0.2903444815769212,
 0.29027063840815653,
 0.29019695614976443,
 0.29012343431204496,
 0.29005007240724584,
 0.28997686994955324,
 0.28990382645508195,
 0.2898309414418661,
 0.28975821442984934,
 0.2896856449408762,
 0.2896132324986818,
 0.28954097662888373,
 0.2894688768589717,
 0.2893969327182993,
 0.2893251437380742,
 0.28925350945134976,
 0.28918202939301557,
 0.28911070309978837,
 0.28903953011020395,
 0.2889685099646074,
 0.2888976422051448,
 0.2888269263757545,
 0.2887563620221583,
 0.2886859486918528,
 0.2886156859341009,
 0.28854557329992314,
 0.28847561034208946,
 0.2884057966151105,
 0.28833613167522926,
 0.28826661508041285,
 0.2881972463903443,
 0.28812802516641384,
 0.28805895097171147,
 0.2879900233710179,
 0.28792124193079716,
 0.2878526062191884,
 0.28778411580599755,
 0.28771577026268946,
 0.2876475691623806,
 0.2875795120798301,
 0.28751159859143277,
 0.287443828275211,
 0.287376200710807,
 0.28730871547947506,
 0.28724137216407414,
 0.28717417034906023,
 0.2871071096204781,
 0.2870401895659548,
 0.2869734097746916,
 0.28690676983745644,
 0.286840269346577,
 0.2867739078959329,
 0.2867076850809485,
 0.28664160049858584,
 0.286575653747337,
 0.28650984442721744,
 0.28644417213975826,
 0.28637863648799966,
 0.2863132370764832,
 0.28624797351124565,
 0.2861828453998109,
 0.286117852351184,
 0.2860529939758436,
 0.28598826988573556,
 0.28592367969426563,
 0.28585922301629285,
 0.2857948994681228,
 0.2857307086675011,
 0.2856666502336062,
 0.28560272378704316,
 0.28553892894983685,
 0.28547526534542533,
 0.2854117325986535,
 0.2853483303357664,
 0.2852850581844028,
 0.2852219157735888,
 0.28515890273373157,
 0.2850960186966124,
 0.28503326329538103,
 0.2849706361645493,
 0.28490813693998435,
 0.2848457652589029,
 0.2847835207598649,
 0.28472140308276733,
 0.2846594118688381,
 0.28459754676063004,
 0.2845358074020146,
 0.28447419343817626,
 0.2844127045156059,
 0.28435134028209563,
 0.2842901003867321,
 0.28422898447989103,
 0.2841679922132314,
 0.2841071232396893,
 0.28404637721347237,
 0.2839857537900539,
 0.2839252526261674,
 0.2838648733798003,
 0.28380461571018883,
 0.2837444792778122,
 0.28368446374438683,
 0.2836245687728609,
 0.283564794027409,
 0.283505139173426,
 0.28344560387752227,
 0.2833861878075178,
 0.2833268906324367,
 0.2832677120225021,
 0.28320865164913067,
 0.28314970918492705,
 0.2830908843036786,
 0.2830321766803507,
 0.2829735859910804,
 0.28291511191317203,
 0.28285675412509176,
 0.28279851230646236,
 0.2827403861380581,
 0.2826823753017993,
 0.28262447948074804,
 0.2825666983591023,
 0.28250903162219115,
 0.2824514789564698,
 0.2823940400495148,
 0.28233671459001847,
 0.28227950226778475,
 0.2822224027737236,
 0.28216541579984655,
 0.2821085410392616,
 0.28205177818616844,
 0.28199512693585393,
 0.28193858698468677,
 0.28188215803011313,
 0.28182583977065184,
 0.28176963190588966,
 0.2817135341364764,
 0.28165754616412075,
 0.28160166769158507,
 0.28154589842268124,
 0.2814902380622658,
 0.28143468631623547,
 0.2813792428915226,
 0.28132390749609054,
 0.2812686798389295,
 0.2812135596300517,
 0.2811585465804871,
 0.2811036404022788,
 0.281048840808479,
 0.2809941475131443,
 0.2809395602313315,
 ...]

We need to evaluate our model

If our model is working, we should see our cost decrease after every iteration.

In [102]:
plt.plot(list(range(iters)), cost_history, '-r') 
Out[102]:
[<matplotlib.lines.Line2D at 0x7fd438f26700>]

We can also compute the accuracy

Accuracy measures how correct our predictions were. In this case we simply compare predicted labels to true labels and divide by the total.

Let us map probabilities to classes

The final step is to assign class labels (0 or 1) based on our predicted probabilities.

In [103]:
# set up decision boundary
def decision_boundary(prob):
  return 1 if prob >= .5 else 0

# Convert probabilities to classes
def classify(predictions):
  '''
  input  - N element array of predictions between 0 and 1
  output - N element array of 0s (False) and 1s (True)
  '''
  vec_decision_boundary = np.vectorize(decision_boundary)
  return vec_decision_boundary(predictions).flatten()
In [104]:
def accuracy(predicted_labels, actual_labels):
    diff = predicted_labels - actual_labels
    return 1.0 - (float(np.count_nonzero(diff)) / len(diff))

pred_prob = predict(features,weights)
predicted_labels = classify(pred_prob)
predicted_labels.ndim
accuracy(predicted_labels,labels.flatten())
Out[104]:
0.917
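
As mentioned earlier, libraries like scikit-learn have their own implementation of logistic regression. Here is a minimal sketch that cross-checks our result with sklearn's LogisticRegression on the same data; the exact number will differ because it fits an intercept and applies regularization by default:

from sklearn.linear_model import LogisticRegression

clf = LogisticRegression()
clf.fit(features, labels.flatten())
print(clf.score(features, labels.flatten()))   # accuracy on the training data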