
Distributed training

This folder contains code for training the Inclusive classifier with tf.keras in distributed mode. The tf.distribute strategy MultiWorkerMirroredStrategy is used to parallelize training, and tf.data is used to read the input data in TFRecord format.
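
As a rough illustration of how these pieces fit together, the sketch below shows the typical pattern: build a MultiWorkerMirroredStrategy, construct a tf.data pipeline over TFRecord files, and create and compile the Keras model inside the strategy scope. The file pattern, feature spec, and tiny model here are placeholders, not the actual Inclusive classifier; the real schema and architecture are defined in the training code in this folder, and each worker is expected to have TF_CONFIG set to describe the cluster.

```python
import tensorflow as tf

# Each worker reads TF_CONFIG from the environment to learn its role in the cluster.
strategy = tf.distribute.MultiWorkerMirroredStrategy()

# Placeholder feature spec and file pattern for illustration only.
feature_spec = {
    "features": tf.io.FixedLenFeature([14], tf.float32),
    "label": tf.io.FixedLenFeature([], tf.int64),
}

def parse_example(serialized):
    parsed = tf.io.parse_single_example(serialized, feature_spec)
    return parsed["features"], parsed["label"]

# tf.data pipeline over TFRecord files: parse, shuffle, batch, prefetch.
files = tf.data.Dataset.list_files("data/train-*.tfrecord")
dataset = (
    tf.data.TFRecordDataset(files)
    .map(parse_example, num_parallel_calls=tf.data.AUTOTUNE)
    .shuffle(10_000)
    .batch(128)
    .prefetch(tf.data.AUTOTUNE)
)

# Model creation and compilation go inside the strategy scope so that
# variables are mirrored and gradients are aggregated across workers.
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation="relu", input_shape=(14,)),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])

model.fit(dataset, epochs=5)
```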

Note: see also Training_TFKeras_CPU_GPU_K8S_Distributed for distributed training on Kubernetes clusters, using the custom TF-Spawner tool.