
Synaptic

Join the chat at https://synapticjs.slack.com

Important: Synaptic 2.x is in the discussion stage now! Feel free to participate in the discussion.

Synaptic is a javascript neural network library for node.js and the browser. Its generalized algorithm is architecture-free, so you can build and train basically any type of first order or even second order neural network architecture.

This library includes a few built-in architectures like multilayer perceptrons, multilayer long short-term memory networks (LSTM), liquid state machines and Hopfield networks, as well as a trainer capable of training any given network, which includes built-in training tasks/tests like solving an XOR, completing a distracted sequence recall task or an embedded Reber grammar test, so you can easily test and compare the performance of different architectures.

The algorithm implemented by this library has been taken from Derek D. Monner's paper:

A generalized LSTM-like training algorithm for second-order recurrent neural networks

References to the equations in that paper are included as comments throughout the source code.

Introduction

If you have no prior knowledge about Neural Networks, you should start by reading this guide.

If you want a practical example of how to feed data to a neural network, then take a look at this article.

You may also want to take a look at this article.

Demos

The source code of these demos can be found in this branch.

Getting started

To try out the examples, check out the gh-pages branch:

```bash
git checkout gh-pages
```

Other languages

This README is also available in other languages.

Overview

Installation

In node

You can install synaptic with npm:

```bash
npm install synaptic --save
```
In the browser

You can install synaptic with bower:

```bash
bower install synaptic
```

Or you can simply use the CDN link, kindly provided by CDNjs:

```html
<script src="https://cdnjs.cloudflare.com/ajax/libs/synaptic/1.1.4/synaptic.js"></script>
```

Usage

```javascript
var synaptic = require('synaptic'); // this line is not needed in the browser
var Neuron = synaptic.Neuron,
    Layer = synaptic.Layer,
    Network = synaptic.Network,
    Trainer = synaptic.Trainer,
    Architect = synaptic.Architect;
```

Now you can start to create networks, train them, or use built-in networks from the Architect.
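For instance, here is a minimal sketch that builds one of the built-in networks from the Architect and trains it with a Trainer on a small, illustrative XOR data set (the training options shown are just example values):

```javascript
// build a built-in perceptron from the Architect: 2 inputs, 3 hidden, 1 output
var myNetwork = new Architect.Perceptron(2, 3, 1);
var trainer = new Trainer(myNetwork);

// a small example training set; each entry maps an input array to an output array
var trainingSet = [
  { input: [0, 0], output: [0] },
  { input: [0, 1], output: [1] },
  { input: [1, 0], output: [1] },
  { input: [1, 1], output: [0] }
];

trainer.train(trainingSet, {
  rate: 0.1,         // learning rate (example value)
  iterations: 20000, // maximum number of training iterations (example value)
  error: 0.005,      // stop once the error drops below this threshold
  shuffle: true      // shuffle the training set between iterations
});

myNetwork.activate([1, 0]); // should return a value close to 1
```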

Examples

Perceptron

This is how you can create a simple perceptron:

```javascript
function Perceptron(input, hidden, output) {
  // create the layers
  var inputLayer = new Layer(input);
  var hiddenLayer = new Layer(hidden);
  var outputLayer = new Layer(output);

  // connect the layers
  inputLayer.project(hiddenLayer);
  hiddenLayer.project(outputLayer);

  // set the layers
  this.set({
    input: inputLayer,
    hidden: [hiddenLayer],
    output: outputLayer
  });
}

// extend the prototype chain
Perceptron.prototype = new Network();
Perceptron.prototype.constructor = Perceptron;
```

Now you can test your new network by creating a trainer and teaching the perceptron to learn an XOR:

```javascript
var myPerceptron = new Perceptron(2, 3, 1);
var myTrainer = new Trainer(myPerceptron);

myTrainer.XOR(); // { error: 0.004998819355993572, iterations: 21871, time: 356 }

myPerceptron.activate([0, 0]); // 0.0268581547421616
myPerceptron.activate([1, 0]); // 0.9829673642853368
myPerceptron.activate([0, 1]); // 0.9831714267395621
myPerceptron.activate([1, 1]); // 0.02128894618097928
```
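You can also train the network manually, activating it on each input and propagating the target output back through it; the learning rate and iteration count below are just example values:

```javascript
// manual (online) training with activate/propagate
var learningRate = 0.3; // example value

for (var i = 0; i < 20000; i++) {
  // 0,0 => 0
  myPerceptron.activate([0, 0]);
  myPerceptron.propagate(learningRate, [0]);

  // 0,1 => 1
  myPerceptron.activate([0, 1]);
  myPerceptron.propagate(learningRate, [1]);

  // 1,0 => 1
  myPerceptron.activate([1, 0]);
  myPerceptron.propagate(learningRate, [1]);

  // 1,1 => 0
  myPerceptron.activate([1, 1]);
  myPerceptron.propagate(learningRate, [0]);
}
```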
Long Short-Term Memory

This is how you can create a simple long short-term memory network, with input gate, forget gate, output gate and peephole connections:

```javascript
function LSTM(input, blocks, output) {
  // create the layers
  var inputLayer = new Layer(input);
  var inputGate = new Layer(blocks);
  var forgetGate = new Layer(blocks);
  var memoryCell = new Layer(blocks);
  var outputGate = new Layer(blocks);
  var outputLayer = new Layer(output);

  // connections from input layer
  var input = inputLayer.project(memoryCell);
  inputLayer.project(inputGate);
  inputLayer.project(forgetGate);
  inputLayer.project(outputGate);

  // connections from memory cell
  var output = memoryCell.project(outputLayer);

  // self-connection
  var self = memoryCell.project(memoryCell);

  // peepholes
  memoryCell.project(inputGate);
  memoryCell.project(forgetGate);
  memoryCell.project(outputGate);

  // gates
  inputGate.gate(input, Layer.gateType.INPUT);
  forgetGate.gate(self, Layer.gateType.ONE_TO_ONE);
  outputGate.gate(output, Layer.gateType.OUTPUT);

  // input to output direct connection
  inputLayer.project(outputLayer);

  // set the layers of the neural network
  this.set({
    input: inputLayer,
    hidden: [inputGate, forgetGate, memoryCell, outputGate],
    output: outputLayer
  });
}

// extend the prototype chain
LSTM.prototype = new Network();
LSTM.prototype.constructor = LSTM;
```
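As a rough, illustrative sketch (not part of the original example), such an LSTM can be trained one time step at a time with activate/propagate; here it is taught to output the bit it received on the previous time step, with arbitrary example values for the rate and number of iterations:

```javascript
// teach the LSTM a one-step memory task: output the previous input bit
var myLSTM = new LSTM(1, 6, 1); // 1 input, 6 memory blocks, 1 output (example sizes)
var rate = 0.2;                 // example learning rate

for (var iteration = 0; iteration < 5000; iteration++) {
  var previous = 0;
  for (var t = 0; t < 10; t++) {
    var bit = Math.round(Math.random());
    myLSTM.activate([bit]);
    myLSTM.propagate(rate, [previous]); // the target is the previous input bit
    previous = bit;
  }
}

// after training, the output should roughly track the previous input
myLSTM.activate([1]);
myLSTM.activate([0]); // should be close to 1 (the bit seen one step earlier)
```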

These are examples for explanatory purposes; the Architect already includes multilayer perceptron and multilayer LSTM network architectures.
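For instance, equivalent networks can be created in a single line with the Architect (layer sizes here are just example values):

```javascript
// built-in architectures from the Architect
var perceptron = new Architect.Perceptron(2, 3, 1); // 2 inputs, 3 hidden neurons, 1 output
var lstm = new Architect.LSTM(2, 6, 1);              // 2 inputs, 6 memory blocks, 1 output
```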

Contribute

Synaptic is an open source project, and contributions are welcome.

If you want to contribute, feel free to send PRs; just make sure to run the tests and the build (npm run test and npm run build) before submitting, so your changes don't break anything.

Support

If you like this project and you want to show your support, you can buy me a beer with magic internet money:

BTC: 16ePagGBbHfm2d6esjMXcUBTNgqpnLWNeK
ETH: 0xa423bfe9db2dc125dd3b56f215e09658491cc556
LTC: LeeemeZj6YL6pkTTtEGHFD6idDxHBF2HXa
XMR: 46WNbmwXpYxiBpkbHjAgjC65cyzAxtaaBQjcGpAZquhBKw2r8NtPQniEgMJcwFMCZzSBrEJtmPsTR54MoGBDbjTi2W1XmgM

<3