I'm trying to train a convolutional neural network with Keras to predict the tags of questions from the cooking Stack Exchange site.
The i-th question in my dataset looks like this:
id 2
title How should I cook bacon in an oven?
content <p>I've heard of people cooking bacon in an ov...
tags oven cooking-time bacon
Name: 1, dtype: object
I have already stripped the HTML tags with BeautifulSoup and removed the punctuation. Since the question bodies are quite long, I decided to focus on the titles. I used sklearn's CountVectorizer to vectorize the words in the titles, but that still left more than 8000 words (excluding stop words), so I decided to apply part-of-speech tagging and keep only the nouns and gerunds.
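The HTML/punctuation cleanup is roughly the following (a simplified sketch of what I do; the exact regex and applying it to the 'content' column are just for illustration):

import re
from bs4 import BeautifulSoup

def clean_text(text):
    # strip HTML tags, then replace punctuation with spaces
    plain = BeautifulSoup(text, 'html.parser').get_text()
    return re.sub(r'[^\w\s]', ' ', plain)

dataframes['cooking']['content'] = dataframes['cooking']['content'].apply(clean_text)

The part-of-speech filtering on the titles then looks like this: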
import nltk
from nltk.tokenize import word_tokenize
from sklearn.feature_extraction.text import CountVectorizer

vectorizer = CountVectorizer(stop_words='english')
titles = dataframes['cooking']['title']
pos_titles = []
for i, title in enumerate(titles):
    pos = []
    pt_titl = nltk.pos_tag(word_tokenize(title))
    for pt in pt_titl:
        # keep nouns (NN, NNS) and gerunds (VBG)
        if pt[1] == 'NN' or pt[1] == 'NNS' or pt[1] == 'VBG':  # or pt[1]=='VBP' or pt[1]=='VBS':
            pos.append(pt[0])
    pos_titles.append(" ".join(pos))

X = vectorizer.fit_transform(pos_titles)
These are my input vectors. I also vectorized the tags and extracted dense matrices for both the inputs and the labels:
tags = [" ".join(x) for x in dataframes['cooking']['tags']]
Xd = X.todense()
Y = vectorizer.fit_transform(tags)
Yd = Y.todense()
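As a quick sanity check on the matrices (the actual numbers depend on the data, of course):

print(Xd.shape)     # (number of questions, size of title vocabulary)
print(Yd.shape)     # (number of questions, size of tag vocabulary)
print(Yd[0].sum())  # number of tags attached to the first question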
I split the data into a training and a validation set:
from sklearn.model_selection import train_test_split
X_train, X_test, y_train, y_test = train_test_split(Xd, Yd, test_size=0.33, random_state=42)
Now I'm trying to train a Conv1D network:
from keras.models import Sequential
from keras.layers import Dense, Activation, Flatten
from keras.layers import Conv2D, MaxPooling2D, Conv1D, Embedding, GlobalMaxPooling1D, Dropout, MaxPooling1D

model = Sequential()
model.add(Embedding(Xd.shape[1],
                    128,
                    input_length=Xd.shape[1]))
model.add(Conv1D(32, 5, activation='relu'))
model.add(MaxPooling1D(100, stride=50))
model.add(Conv1D(32, 5, activation='relu'))
model.add(GlobalMaxPooling1D())
model.add(Dense(Yd.shape[1], activation='softmax'))
model.compile(optimizer='rmsprop',
              loss='categorical_crossentropy',
              metrics=['accuracy'])
model.fit(X_train, y_train, batch_size=32, verbose=1)
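For reference, before fitting I check the shapes being fed in and inspect the architecture (summary output omitted here):

print(X_train.shape, y_train.shape)
model.summary()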
But it gets stuck at a very low accuracy, and the loss barely decreases over the epochs:
Epoch 1/10
10320/10320 [==============================] - 401s - loss: 15.8098 - acc: 0.0604
Epoch 2/10
10320/10320 [==============================] - 339s - loss: 15.5671 - acc: 0.0577
Epoch 3/10
10320/10320 [==============================] - 314s - loss: 15.5509 - acc: 0.0578
Epoch 4/10
10320/10320 [==============================] - 34953s - loss: 15.5493 - acc: 0.0578
Epoch 5/10
10320/10320 [==============================] - 323s - loss: 15.5587 - acc: 0.0578
Epoch 6/10
6272/10320 [=================>............] - ETA: 133s - loss: 15.6005 - acc: 0.0550