Releases: konas122/DaZero

v1.1

12 Jan 12:33
Update the `MNIST` dataset URL.

Add Transformers

04 Mar 07:51

Add `SelfAttention` and `TransformerBlock`.
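
No usage snippet ships with this release. As a rough sketch of how the new layers might be called, assuming they live in `dazero.layers` and follow the same construct-then-call pattern as `LayerNorm` below — the `d_model` and `num_heads` constructor arguments are hypothetical, not a documented signature:

import numpy as np

import dazero.layers as L
from dazero import Parameter

# Hypothetical sketch: the constructor arguments below are assumptions,
# not the confirmed DaZero API.
x = Parameter(np.random.rand(8, 16, 64).astype(np.float64))  # (batch, seq, d_model)
attn = L.SelfAttention(d_model=64, num_heads=4)
block = L.TransformerBlock(d_model=64, num_heads=4)
h = attn(x)    # attention over the sequence axis
y = block(x)   # attention followed by a feed-forward sublayer
y.backward()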

v0.2

21 Feb 15:14

Add the `LayerNorm` function (`F.layer_norm`) and layer (`L.LayerNorm`). `normalized_shape` must match the trailing axes of the input; each sample is standardized over those axes, with an optional affine transform via the `gamma` and `beta` arguments.

Example 1

import numpy as np

import dazero.functions as F
from dazero import Parameter

# Normalize over the trailing (100, 30, 30) axes of each sample.
inputs = np.random.rand(100, 100, 30, 30).astype(np.float64)
normalized_shape = (100, 30, 30)
x = Parameter(inputs)
output = F.layer_norm(x, normalized_shape)

Example 2

import numpy as np

import dazero.layers as L
from dazero import Model, Parameter


class Net(Model):
    """A minimal model wrapping a single LayerNorm layer."""

    def __init__(self, normalized_shape, gamma=None, beta=None):
        super().__init__()
        self.layer = L.LayerNorm(normalized_shape, gamma=gamma, beta=beta)

    def forward(self, inputs):
        return self.layer(inputs)


# Normalize over the trailing (100, 30, 30) axes of each sample.
inputs = np.random.rand(100, 100, 30, 30).astype(np.float64)
normalized_shape = (100, 30, 30)
x = Parameter(inputs)
model = Net(normalized_shape)
output = model(x)
output.backward()  # propagate gradients back through the LayerNorm layer
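
Example 1 uses the stateless functional form, while Example 2 wraps `L.LayerNorm` in a `Model`, which lets the optional `gamma` and `beta` affine parameters be held by the layer, managed alongside the rest of a network's parameters, and included in the backward pass.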

v0.1

19 Feb 12:29

Initial version of DaZero.