import tensorboardX


class TensorBoardLogger:
    """
    Wrapper class for easy TensorboardX logging
    """
    def __init__(self, log_dir):
        self.grads = {}
        self.tensors = {}
        self.writer = tensorboardX.SummaryWriter(log_dir)

    def grad_hook_by_name(self, grad_name):
        # Return a backward hook that captures the incoming gradient
        # under the given name, for later histogram logging.
        def backprop_hook(grad):
            self.grads[grad_name] = grad
        return backprop_hook

    def save_losses(self, losses_accu: dict, epoch: int):
        # Log the running average of each accumulated loss as a scalar.
        for loss_name, loss_value in losses_accu.items():
            self.writer.add_scalar('losses/{}'.format(loss_name.strip()),
                                   loss_value.avg, global_step=epoch)

    def save_grads(self, epoch: int):
        # Write a histogram for every gradient captured by the hooks.
        for grad_name, grad_values in self.grads.items():
            self.writer.add_histogram(grad_name, grad_values, global_step=epoch)

    def add_tensor(self, name: str, tensor):
        self.tensors[name] = tensor

    def save_tensors(self, epoch: int):
        # Write a histogram for every registered tensor.
        for tensor_name, tensor_value in self.tensors.items():
            self.writer.add_histogram('tensor/{}'.format(tensor_name),
                                      tensor_value, global_step=epoch)
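The key trick in `grad_hook_by_name` is the closure: each call produces a fresh hook function that remembers its own `grad_name`, so one logger can track many named gradients. A minimal plain-Python sketch of that pattern (no PyTorch or tensorboardX required; the tensor name `'encoder/conv1'` and the list-valued "gradient" are illustrative stand-ins — in real training code the hook would be registered on a tensor, e.g. via `tensor.register_hook(...)`):

```python
# Stand-in for the logger's self.grads dict.
grads = {}

def grad_hook_by_name(grad_name):
    # The inner function closes over grad_name, so each returned hook
    # stores its gradient under a distinct key.
    def backprop_hook(grad):
        grads[grad_name] = grad
    return backprop_hook

# Create a named hook, then simulate the backward pass delivering a gradient.
hook = grad_hook_by_name('encoder/conv1')
hook([0.1, -0.2])
print(grads)  # → {'encoder/conv1': [0.1, -0.2]}
```

Because the gradients accumulate in a dict keyed by name, a later `save_grads(epoch)` call can iterate over everything captured during the backward pass and emit one histogram per named gradient.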