Channel: Tensorflow: feeding data to the graph - Stack Overflow
Answer by Rick Lentz for Tensorflow: feeding data to the graph

Yes. If each file is small (e.g. a single line of text), there will likely be many small host-to-device transfers.

Here is an example of using feed_dict with a custom fill_feed_dict function: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/examples/tutorials/mnist/fully_connected_feed.py
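The core of that example is a small helper that maps each placeholder to the next batch. A minimal sketch of the pattern in plain Python (names here are illustrative; in the real tutorial the dict keys are `tf.placeholder` tensors and `data_set` is the MNIST `DataSet` object):

```python
def fill_feed_dict(data_set, images_pl, labels_pl, batch_size=100):
    """Map each placeholder to the next batch from data_set."""
    images, labels = data_set.next_batch(batch_size)
    return {images_pl: images, labels_pl: labels}

class ToyDataSet:
    """Stand-in for the tutorial's DataSet object, for illustration."""
    def __init__(self, images, labels):
        self._images, self._labels, self._pos = images, labels, 0

    def next_batch(self, n):
        batch = (self._images[self._pos:self._pos + n],
                 self._labels[self._pos:self._pos + n])
        self._pos += n
        return batch
```

The returned dict is what you pass as `sess.run(train_op, feed_dict=fill_feed_dict(...))` on every step.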

For your use case, though, it may be easier to handle many small files using TensorFlow's QueueRunner. This creates a pool of reader threads that prefetch your data, speeding up data availability to the TensorFlow graph.

# Create the graph, etc.
init_op = tf.global_variables_initializer()

# Create a session for running operations in the Graph.
sess = tf.Session()

# Initialize the variables (like the epoch counter).
sess.run(init_op)

# Start input enqueue threads.
coord = tf.train.Coordinator()
threads = tf.train.start_queue_runners(sess=sess, coord=coord)

try:
    while not coord.should_stop():
        # Run training steps or whatever
        sess.run(train_op)

except tf.errors.OutOfRangeError:
    print('Done training -- epoch limit reached')
finally:
    # When done, ask the threads to stop.
    coord.request_stop()

# Wait for threads to finish.
coord.join(threads)
sess.close()

(https://www.tensorflow.org/how_tos/reading_data/#batching)
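Conceptually, the QueueRunner/Coordinator machinery above is just a pool of producer threads filling a bounded queue while the training loop drains it, so file I/O overlaps with computation. A plain-Python analogue of that idea (no TensorFlow required, names illustrative) looks like:

```python
import queue
import threading

def reader(files, q):
    """Producer thread: 'read' each file and enqueue its contents."""
    for f in files:
        q.put(f"contents of {f}")  # stand-in for actual file reading

def run_pipeline(files, num_threads=2):
    """Prefetch file contents with a thread pool and a bounded queue."""
    q = queue.Queue(maxsize=8)  # bounded buffer limits memory use
    chunks = [files[i::num_threads] for i in range(num_threads)]
    threads = [threading.Thread(target=reader, args=(c, q))
               for c in chunks]
    for t in threads:
        t.start()
    # Consumer: in TensorFlow this role is played by the training loop
    # dequeuing batches from the input queue.
    results = [q.get() for _ in files]
    for t in threads:
        t.join()
    return results
```

TensorFlow's version adds epoch limits (the `OutOfRangeError` above) and clean shutdown via the Coordinator, but the data flow is the same.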
