Jake Worth

Neural Networking for Complete Sentences

Published: May 19, 2019

This week I wrote a bit of neural networking code for the first time. It uses Brain.js to try to recognize whether a sentence is grammatically complete.

Here’s my code.

const net = new brain.recurrent.LSTM()

net.train([
  { input: "Hello, I'm John Walker.", output: 'complete' },
  { input: 'This is on you!', output: 'complete' },
  { input: 'John kik', output: 'incomplete' },
  { input: 'This is', output: 'incomplete' },
  { input: 'Great job.', output: 'complete' },
  { input: 'When I hear a', output: 'incomplete' },
])

What’s going on here? First, I create an instance of Brain’s LSTM (Long Short-Term Memory) network. Then I train it on a collection of sentences, labeling each one as complete or incomplete. Even with just six examples, training is computationally expensive on a new iMac.

Here’s the model in action:

> net.run("I'm Stil.");
> net.run("Great job!")

It works for these examples and fails for others. Why? Certainly the dataset is too small. This code was mostly a bit of hacking to answer this Stack Overflow question:

Brain js NaN
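To put the LSTM's results in perspective, here's a trivial dependency-free baseline, my own sketch rather than anything from the original experiment: guess "complete" when the text starts with a capital letter and ends with terminal punctuation. It misclassifies plenty of sentences, which is exactly why a learned model is the more interesting approach:

```javascript
// Naive baseline: complete = starts capitalized AND ends with . ! or ?
function naiveCompleteness(sentence) {
  const trimmed = sentence.trim()
  const startsCapitalized = /^[A-Z]/.test(trimmed)
  const endsWithTerminator = /[.!?]$/.test(trimmed)
  return startsCapitalized && endsWithTerminator ? 'complete' : 'incomplete'
}

console.log(naiveCompleteness("Hello, I'm John Walker.")) // 'complete'
console.log(naiveCompleteness('When I hear a'))           // 'incomplete'
```

A baseline like this fails on fragments that happen to be punctuated ("Because of the.") where a model trained on enough examples could, in principle, do better.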

I’d like to explore neural networking in the future, when there’s a practical application driving me toward it.
