https://github.com/aymericdamien/Tensor ... ceptron.py
http://www.jessicayung.com/explaining-t ... erceptron/
http://www.juergenwiki.de/notes/deep_le ... ptron.html
http://students.washington.edu/adelak/2017/04/?p=350
TensorFlow Multilayer perceptron
- Antonio Linares
- Site Admin
- Posts: 37481
- Joined: Thu Oct 06, 2005 5:47 pm
- Location: Spain
- Contact:
Re: TensorFlow Multilayer perceptron
Classifying Text with Neural Networks and TensorFlow
https://medium.com/@Synced/big-picture- ... 3358625601
https://github.com/dmesquita/understand ... sorflow_nn
First, create an index for each word. Then, create a matrix for each text, in which the values are 1 if a word is in the text and 0 otherwise.
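The two steps described above can be sketched in Python. This is a minimal illustration of the idea, not code from the linked repository; all names are my own.

```python
# Step 1: assign a unique index to each word across all texts.
def build_word_index(texts):
    index = {}
    for text in texts:
        for word in text.lower().split():
            if word not in index:
                index[word] = len(index)
    return index

# Step 2: turn a text into a binary vector:
# 1 if the word occurs in the text, 0 otherwise.
def text_to_vector(text, index):
    vector = [0] * len(index)
    for word in text.lower().split():
        vector[index[word]] = 1
    return vector

texts = ["Hi from Brazil", "Hi from Spain"]
index = build_word_index(texts)
matrix = [text_to_vector(t, index) for t in texts]
```

Each row of `matrix` is the "bitmap" for one text; all rows share the same vocabulary, so they can be fed to a network with a fixed input size.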
Re: TensorFlow Multilayer perceptron
Creating a bitmap from a text:
https://github.com/dmesquita/understand ... sorflow_nn
Turned into Harbour code:
Code:
#include "FiveWin.ch"

function Main()

   local hVocabulary := hb_Hash(), hWordToIndex
   local cText := "Hi from Brazil"
   local cWord, aMatrix

   // Build the vocabulary: count the occurrences of each word
   for each cWord in TextSplit( cText )
      if hb_HHasKey( hVocabulary, Lower( cWord ) )
         hVocabulary[ Lower( cWord ) ] += 1
      else
         hVocabulary[ Lower( cWord ) ] := 1
      endif
   next

   XBrowser( hVocabulary )
   XBrowser( hWordToIndex := WordToIndex( hVocabulary ) )

   // Build the matrix (vector) for the text, one position per word
   aMatrix := Array( Len( hVocabulary ) )
   AEval( aMatrix, { | n, i | aMatrix[ i ] := 0 } )

   for each cWord in TextSplit( cText )
      aMatrix[ hWordToIndex[ Lower( cWord ) ] ] += 1
   next

   XBrowser( aMatrix )

return nil

// Split a text into an array of tokens (words)
function TextSplit( cText )

   local n, aTokens := {}

   for n = 1 to NumToken( cText )
      AAdd( aTokens, Token( cText,, n ) )
   next

return aTokens

// Map each vocabulary word to its position (1-based index)
function WordToIndex( hVocabulary )

   local hWordToIndex := hb_Hash()
   local n

   for n = 1 to Len( hVocabulary )
      hWordToIndex[ hb_HKeyAt( hVocabulary, n ) ] := n
   next

return hWordToIndex
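For readers following the Python tutorial linked above, here is a rough Python equivalent of the Harbour routine. The naming is my own, not taken from the linked repository.

```python
def text_split(text):
    # Counterpart of TextSplit(): tokenize on whitespace.
    return text.split()

def word_counts(text):
    # Counterpart of the vocabulary loop: count each lowercased word.
    counts = {}
    for word in text_split(text):
        counts[word.lower()] = counts.get(word.lower(), 0) + 1
    return counts

def word_to_index(counts):
    # Counterpart of WordToIndex(): map each word to a 1-based position.
    return {word: i + 1 for i, word in enumerate(counts)}

text = "Hi from Brazil"
counts = word_counts(text)
indexes = word_to_index(counts)

# Counterpart of the aMatrix loop: one count per vocabulary position.
vector = [0] * len(counts)
for word in text_split(text):
    vector[indexes[word.lower()] - 1] += 1
```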
Re: TensorFlow Multilayer perceptron
Meet the Robot Writing ‘Friends’ Sequels
After I read this:
http://www.thedailybeast.com/meet-the-r ... ds-sequels
I got quite curious to understand how he did it:
"It works by predicting the next letter to follow a given sequence of letters, and the predictions are determined by what it learned about language from the Friends dialogue provided."

This seems different from what I previously posted on this thread, but it seems clear that words (and text) must be turned into a bitmap so the neural network can process them.

Imagine a neural network learning from already existing code and writing the next code for you.

I would appreciate it if you share your ideas about it.
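The article's system is a neural network trained on the scripts, but the core idea of "predicting the next letter to follow a given sequence of letters" can be illustrated much more simply with a character bigram frequency table. This is only a sketch of the idea, not the article's method:

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count which character follows each character in the corpus."""
    following = defaultdict(Counter)
    for a, b in zip(corpus, corpus[1:]):
        following[a][b] += 1
    return following

def predict_next(following, ch):
    """Predict the character most often seen after ch in training."""
    if ch not in following:
        return None
    return following[ch].most_common(1)[0][0]

model = train_bigrams("how you doin? how you doin? how you doin?")
```

A real model (such as the recurrent network in the article) conditions on a whole sequence of previous characters rather than just one, which is what lets it produce coherent dialogue.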