Python

nazandr, 2016-11-08 02:41:15

Why don't the loss and acc metrics change while the network is training?

#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import csv
from keras.models import Sequential
from keras.layers import Dense
from keras.layers import Dropout
from keras.layers import LSTM
from keras.callbacks import ModelCheckpoint
import numpy as np

f = open('/Users/andrey/Projects/News-parser/vocab.txt').read().lower()
csvfile = open('/Users/andrey/Projects/News-parser/habr.csv')
dataSet = csv.DictReader(csvfile)

chars = sorted(list(set(f)))
char_to_int = dict((c, i) for i, c in enumerate(chars))
int_to_char = dict((i, c) for i, c in enumerate(chars))
# summarize the loaded data
n_chars = len(f)
n_vocab = len(chars)

xTrain = []
yTrain = []
xTest = []
yTest = []

for row in dataSet:
  a = row['title'].lower()
  while len(a) < 150:
    a += ' '
  xTrain.append([char_to_int[char] for char in a])
  yTrain.append(int(row['rate']))

xTrain = np.reshape(xTrain, (len(xTrain),150))
xTrain = xTrain / float(n_vocab)
print (xTrain)
yTrain = np.reshape(yTrain, (len(yTrain),1))


model = Sequential()
model.add(Dense(50, input_dim=150, activation='sigmoid'))
model.add(Dropout(0.2))
model.add(Dense(1, activation='softmax'))
model.compile(optimizer='rmsprop',
              loss='binary_crossentropy',
              metrics=['accuracy'])
print ('Start model training...')
model.fit(xTrain, yTrain, nb_epoch=20, batch_size=128, shuffle=True)

Start model training...
Epoch 1/20
130/130 [==============================] - 0s - loss: 9.6881 - acc: 0.3923     
Epoch 2/20
130/130 [==============================] - 0s - loss: 9.6881 - acc: 0.3923     
Epoch 3/20
130/130 [==============================] - 0s - loss: 9.6881 - acc: 0.3923     
Epoch 4/20
130/130 [==============================] - 0s - loss: 9.6881 - acc: 0.3923     
Epoch 5/20
130/130 [==============================] - 0s - loss: 9.6881 - acc: 0.3923     
Epoch 6/20
130/130 [==============================] - 0s - loss: 9.6881 - acc: 0.3923     
Epoch 7/20
130/130 [==============================] - 0s - loss: 9.6881 - acc: 0.3923     
Epoch 8/20
130/130 [==============================] - 0s - loss: 9.6881 - acc: 0.3923     
Epoch 9/20
130/130 [==============================] - 0s - loss: 9.6881 - acc: 0.3923     
Epoch 10/20
130/130 [==============================] - 0s - loss: 9.6881 - acc: 0.3923     
Epoch 11/20
130/130 [==============================] - 0s - loss: 9.6881 - acc: 0.3923     
Epoch 12/20
130/130 [==============================] - 0s - loss: 9.6881 - acc: 0.3923     
Epoch 13/20
130/130 [==============================] - 0s - loss: 9.6881 - acc: 0.3923     
Epoch 14/20
130/130 [==============================] - 0s - loss: 9.6881 - acc: 0.3923     
Epoch 15/20
130/130 [==============================] - 0s - loss: 9.6881 - acc: 0.3923     
Epoch 16/20
130/130 [==============================] - 0s - loss: 9.6881 - acc: 0.3923     
Epoch 17/20
130/130 [==============================] - 0s - loss: 9.6881 - acc: 0.3923     
Epoch 18/20
130/130 [==============================] - 0s - loss: 9.6881 - acc: 0.3923     
Epoch 19/20
130/130 [==============================] - 0s - loss: 9.6881 - acc: 0.3923     
Epoch 20/20
130/130 [==============================] - 0s - loss: 9.6881 - acc: 0.3923

2 answers
ivodopyanov, 2016-11-08

Too little data: only 130 examples.
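
A hedged sketch of how to make that visible (assuming the same xTrain/yTrain arrays built in the question; the 20% hold-out fraction is my own choice): with 130 examples and batch_size=128 each epoch performs at most two gradient updates, and a validation split shows whether the model generalises at all from so little data.

# Illustration only, reusing the model and data defined above.
# With 130 examples and batch_size=128 there are at most 2 weight updates per epoch.
# validation_split holds out the last 20% of the examples as a validation set.
model.fit(xTrain, yTrain, nb_epoch=20, batch_size=128,
          shuffle=True, validation_split=0.2)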

olegas5, 2017-06-07

Not necessarily: the loss can keep changing, but only from the fifth, sixth, or seventh decimal place, so the rounded values in the progress bar look identical.
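
A hedged sketch of how to check that (assuming the model and data from the question): read the per-epoch loss from the History object returned by model.fit and print it with full precision instead of relying on the rounded progress-bar output.

# Train silently and inspect the recorded loss values with more decimal places.
history = model.fit(xTrain, yTrain, nb_epoch=20, batch_size=128,
                    shuffle=True, verbose=0)
for epoch, loss in enumerate(history.history['loss'], 1):
    print('Epoch %d: loss = %.10f' % (epoch, loss))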
