How can I get the accuracy at every step within an epoch in Keras, like in TensorFlow?

data-mining deep-learning keras tensorflow
2022-03-05 08:44:30

In TensorFlow, I get the accuracy at every step, like this -

Step 1, Minibatch Loss= 68458.3359, Training Accuracy= 0.800
Step 10, Minibatch Loss= 451470.3125, Training Accuracy= 0.200
Step 20, Minibatch Loss= 582661.1875, Training Accuracy= 0.200
Step 30, Minibatch Loss= 186046.3125, Training Accuracy= 0.400
Step 1, Minibatch Loss= 161546.6250, Training Accuracy= 0.600
Step 10, Minibatch Loss= 286965.3125, Training Accuracy= 0.400
Step 20, Minibatch Loss= 205545.7500, Training Accuracy= 0.600
Step 30, Minibatch Loss= 202164.6562, Training Accuracy= 0.800
Step 1, Minibatch Loss= 214717.7969, Training Accuracy= 0.600
Step 10, Minibatch Loss= 108088.7344, Training Accuracy= 0.800
Step 20, Minibatch Loss= 80130.6016, Training Accuracy= 0.800
Step 30, Minibatch Loss= 28674.1875, Training Accuracy= 0.800
Step 1, Minibatch Loss= 78675.6641, Training Accuracy= 0.400
Step 10, Minibatch Loss= 168231.2812, Training Accuracy= 0.600
Step 20, Minibatch Loss= 77828.1406, Training Accuracy= 0.600
Step 30, Minibatch Loss= 56584.9609, Training Accuracy= 0.800
Step 1, Minibatch Loss= 29474.0898, Training Accuracy= 0.600
Step 10, Minibatch Loss= 79742.9531, Training Accuracy= 0.800
Step 20, Minibatch Loss= 0.0000, Training Accuracy= 1.000
Step 30, Minibatch Loss= 6736.4688, Training Accuracy= 0.800

But in Keras, I only get the accuracy per epoch -

156/156 [==============================] - 6s 39ms/step - loss: 13.0185 - acc: 0.1923 
Epoch 2/10
156/156 [==============================] - 3s 18ms/step - loss: 12.9151 - acc: 0.1987
Epoch 3/10
156/156 [==============================] - 3s 18ms/step - loss: 13.1218 - acc: 0.1859
Epoch 4/10
156/156 [==============================] - 3s 18ms/step - loss: 12.9151 - acc: 0.1987
Epoch 5/10
156/156 [==============================] - 3s 18ms/step - loss: 13.1218 - acc: 0.1859
Epoch 6/10
156/156 [==============================] - 3s 18ms/step - loss: 12.9151 - acc: 0.1987
Epoch 7/10
156/156 [==============================] - 3s 18ms/step - loss: 12.8118 - acc: 0.2051
Epoch 8/10
156/156 [==============================] - 3s 18ms/step - loss: 12.8118 - acc: 0.2051
Epoch 9/10
156/156 [==============================] - 3s 18ms/step - loss: 12.8118 - acc: 0.2051
Epoch 10/10
156/156 [==============================] - 3s 18ms/step - loss: 12.9151 - acc: 0.1987
1 Answer

Create a custom Callback:

from tensorflow.keras.callbacks import Callback

class NBatchLogger(Callback):
    "A Logger that log average performance per `display` steps."

    def __init__(self, display):
        super().__init__()
        self.step = 0
        self.display = display
        self.metric_cache = {}

    def on_batch_end(self, batch, logs=None):
        logs = logs or {}
        self.step += 1
        # Accumulate every metric Keras reports for this batch.
        for k, v in logs.items():
            self.metric_cache[k] = self.metric_cache.get(k, 0) + v
        if self.step % self.display == 0:
            metrics_log = ''
            for (k, v) in self.metric_cache.items():
                val = v / self.display
                if abs(val) > 1e-3:
                    metrics_log += ' - %s: %.4f' % (k, val)
                else:
                    metrics_log += ' - %s: %.4e' % (k, val)
            print(f"step: {self.step}/{self.params['steps']} ... {metrics_log}")
            self.metric_cache.clear()
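
You can then pass the callback to model.fit. Below is a minimal usage sketch; the model, the x_train/y_train data, and the display=10 interval are placeholders. Setting verbose=0 suppresses Keras's built-in per-epoch progress bar so only the per-step output from the callback is printed:

out = NBatchLogger(display=10)
model.fit(x_train, y_train,
          epochs=10,
          batch_size=32,
          verbose=0,            # silence the default per-epoch progress bar
          callbacks=[out])      # prints averaged metrics every 10 batches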

The inspiration for the code is here.