Logistic layer
- class layers.logistic_layer.Logistic_layer(input_shape=None)[source]
Bases:
NumPyNet.layers.base.BaseLayer
Logistic Layer: performs a logistic transformation of the input and computes the binary cross entropy cost.
- input_shape : tuple (default=None)
Shape of the input in the format (batch, w, h, c); None is used when the layer is part of a Network model.
>>> import os
>>>
>>> import numpy as np
>>> import pylab as plt
>>> from PIL import Image
>>> from NumPyNet.layers.logistic_layer import Logistic_layer
>>>
>>> img_2_float = lambda im : ((im - im.min()) * (1./(im.max() - im.min()) * 1.)).astype(float)
>>> float_2_img = lambda im : ((im - im.min()) * (1./(im.max() - im.min()) * 255.)).astype(np.uint8)
>>>
>>> filename = os.path.join(os.path.dirname(__file__), '..', '..', 'data', 'dog.jpg')
>>> inpt = np.asarray(Image.open(filename), dtype=float)
>>> inpt.setflags(write=1)
>>> inpt = img_2_float(inpt)
>>> inpt = inpt * 2. - 1.
>>>
>>> inpt = np.expand_dims(inpt, axis=0)
>>>
>>> np.random.seed(123)
>>> batch, w, h, c = inpt.shape
>>>
>>> # truth definition, it's random so don't expect much
>>> truth = np.random.choice([0., 1.], p=[.5, .5], size=(batch, w, h, c))
>>>
>>> # Model Initialization
>>> layer = Logistic_layer(input_shape=inpt.shape)
>>>
>>> # FORWARD
>>>
>>> layer.forward(inpt, truth)
>>> forward_out = layer.output
>>> layer_loss = layer.cost
>>>
>>> print(layer)
>>> print('Loss: {:.3f}'.format(layer_loss))
>>>
>>> # BACKWARD
>>>
>>> delta = np.zeros(shape=inpt.shape, dtype=float)
>>> layer.backward(delta)
>>>
>>> # Visualizations
>>>
>>> fig, (ax1, ax2, ax3) = plt.subplots(nrows=1, ncols=3, figsize=(10, 5))
>>> fig.subplots_adjust(left=0.1, right=0.95, top=0.95, bottom=0.15)
>>>
>>> fig.suptitle('Logistic Layer:\nloss({0:.3f})'.format(layer_loss))
>>>
>>> ax1.imshow(float_2_img(inpt[0]))
>>> ax1.axis('off')
>>> ax1.set_title('Original Image')
>>>
>>> ax2.imshow(float_2_img(forward_out[0]))
>>> ax2.axis('off')
>>> ax2.set_title('Forward Image')
>>>
>>> ax3.imshow(float_2_img(delta[0]))
>>> ax3.axis('off')
>>> ax3.set_title('Delta Image')
>>>
>>> fig.tight_layout()
>>> plt.show()
TODO
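The two operations the layer description mentions, the logistic (sigmoid) transform and the binary cross entropy cost, can be sketched standalone in plain NumPy. The function names below are illustrative, not the NumPyNet API:

```python
import numpy as np

# Minimal sketch, assuming a sigmoid transform followed by a binary
# cross entropy cost averaged over all elements (illustrative names).
def logistic(x):
    # element-wise sigmoid, maps any real value into (0, 1)
    return 1. / (1. + np.exp(-x))

def binary_cross_entropy(output, truth, eps=1e-12):
    # clip to avoid log(0)
    out = np.clip(output, eps, 1. - eps)
    return -np.mean(truth * np.log(out) + (1. - truth) * np.log(1. - out))

x = np.array([[-2., 0., 2.]])
t = np.array([[0., 1., 1.]])
y = logistic(x)                      # same shape as x, values in (0, 1)
loss = binary_cross_entropy(y, t)    # scalar, non-negative
```

The cost is low when `y` is close to `t` element-wise and grows without bound as a prediction approaches the wrong extreme.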
- backward(delta=None)[source]
Backward function of the Logistic Layer
- Parameters
delta (array-like (default=None)) – Global delta of shape (batch, w, h, c) to be backpropagated.
- Return type
self
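For a sigmoid paired with binary cross entropy, the gradient of the loss with respect to the pre-activation input reduces to the well-known simplification `output - truth`. A hedged sketch of that update (the exact scaling used by NumPyNet may differ):

```python
import numpy as np

# Sketch, assuming the standard sigmoid + binary-cross-entropy
# simplification d(loss)/d(input) = output - truth, here averaged
# over all elements (illustrative, not the NumPyNet source).
def backward_sketch(output, truth):
    return (output - truth) / output.size

output = np.array([[0.1, 0.5, 0.9]])
truth = np.array([[0., 1., 1.]])
delta = backward_sketch(output, truth)
```

The sign of each entry pushes the corresponding prediction toward its truth value: positive where the output overshoots the label, negative where it undershoots.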
- forward(inpt, truth=None)[source]
Forward function of the logistic layer
- Parameters
inpt (array-like) – Input batch of images in the format (batch, in_w, in_h, in_c)
truth (array-like (default=None)) – Truth values; must have the same shape as inpt. If None, the layer does not compute the cost, but simply transforms the input
- Return type
self
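The truth=None branching described above can be sketched in plain NumPy. This is an illustrative standalone function under the sigmoid + binary cross entropy assumption, not the NumPyNet source:

```python
import numpy as np

# Sketch of the forward pass: the sigmoid output is always computed,
# while the cost is computed only when a truth array is supplied
# (names and scaling are illustrative assumptions).
def forward_sketch(inpt, truth=None):
    output = 1. / (1. + np.exp(-inpt))
    cost = None
    if truth is not None:
        eps = 1e-12
        out = np.clip(output, eps, 1. - eps)
        cost = -np.mean(truth * np.log(out) + (1. - truth) * np.log(1. - out))
    return output, cost

x = np.random.uniform(-1., 1., size=(1, 4, 4, 3))
out_only, cost = forward_sketch(x)                        # cost stays None
out_full, cost2 = forward_sketch(x, truth=np.ones_like(x))
```

Calling the layer without a truth array is therefore equivalent to a plain sigmoid activation over the batch.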
- property out_shape
Get the output shape
- Returns
out_shape – Tuple as (batch, out_w, out_h, out_c)
- Return type
tuple