
WZY99 / GPNR Code Analysis

This repository has not declared an open-source license file (LICENSE). Before using it, check the project description and the licenses of its upstream code dependencies.
main.py 2.58 KB
WZY99 committed on 2022-09-20 22:03: Init.
# coding=utf-8
# Copyright 2022 The Google Research Authors.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

"""Main file for running the example."""

from absl import app
from absl import flags
from absl import logging
# Required import to setup work units when running through XManager.
from clu import platform
import jax
from ml_collections import config_flags
import tensorflow as tf

from . import eval_lib
from . import train_lib

FLAGS = flags.FLAGS

config_flags.DEFINE_config_file(
    "ml_config", None, "Training configuration.", lock_config=True)
flags.DEFINE_string("workdir", None, "Work unit directory.")
flags.DEFINE_bool("is_train", None, "If true, run train else eval")
flags.mark_flags_as_required(["ml_config", "workdir", "is_train"])
# Flags --jax_backend_target and --jax_xla_backend are available through JAX.


def main(argv):
  del argv

  # Hide any GPUs from TensorFlow. Otherwise TF might reserve memory and make
  # it unavailable to JAX.
  tf.config.experimental.set_visible_devices([], "GPU")

  if FLAGS.jax_backend_target:
    logging.info("Using JAX backend target %s", FLAGS.jax_backend_target)
    jax_xla_backend = ("None" if FLAGS.jax_xla_backend is None else
                       FLAGS.jax_xla_backend)
    logging.info("Using JAX XLA backend %s", jax_xla_backend)

  logging.info("JAX process: %d / %d", jax.process_index(), jax.process_count())
  logging.info("JAX devices: %r", jax.devices())

  if FLAGS.is_train:
    # Add a note so that we can tell which Borg task is which JAX host.
    # (Borg task 0 is not guaranteed to be host 0)
    platform.work_unit().set_task_status(
        f"process_index: {jax.process_index()}, "
        f"process_count: {jax.process_count()}")
    platform.work_unit().create_artifact(platform.ArtifactType.DIRECTORY,
                                         FLAGS.workdir, "workdir")
    train_lib.train_and_evaluate(FLAGS.ml_config, FLAGS.workdir)
  else:
    eval_lib.evaluate(FLAGS.ml_config, FLAGS.workdir)


if __name__ == "__main__":
  # Provide access to --jax_backend_target and --jax_xla_backend flags.
  jax.config.config_with_absl()
  app.run(main)
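
A note on the --ml_config flag: config_flags.DEFINE_config_file from ml_collections loads the Python file given on the command line and calls its get_config() function, which must return a ConfigDict; that object is what train_lib.train_and_evaluate and eval_lib.evaluate receive as FLAGS.ml_config. Below is a minimal sketch of the general shape of such a config file; the field names (batch_size, learning_rate) and the path configs/defaults.py are placeholders for illustration, not GPNR's actual options.

# configs/defaults.py -- minimal sketch of a file accepted by --ml_config.
# Field names here are hypothetical placeholders, not GPNR's real settings.
import ml_collections


def get_config():
  """Return the ConfigDict passed to train_lib / eval_lib via FLAGS.ml_config."""
  config = ml_collections.ConfigDict()
  config.batch_size = 4        # hypothetical example field
  config.learning_rate = 1e-4  # hypothetical example field
  return config

Because main.py uses relative imports (from . import train_lib), it has to be run as a module, e.g. python -m <package>.main --ml_config=configs/defaults.py --workdir=/tmp/gpnr --is_train=true, where <package> stands for whatever directory name the repository uses for this package.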