Wget best practices on large files


I have a text file that is 150 GB in size and contains around 2.5 billion links that I need to download.

I am wondering, if I pass this file to wget, whether wget loads the entire file's contents into memory, and whether that could potentially crash the machine.
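
For context, the invocation I have in mind is simply to feed the whole list to wget's input-file option, along these lines (links.txt is just a placeholder name for my 150 GB URL list):

    # Planned invocation; my worry is whether wget tries to hold all
    # 2.5 billion URLs from links.txt in memory at once.
    wget --input-file=links.txt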

I am also interested in best practices for downloading this number of links. For example, does the download directory become unusable once it contains this many files?
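
One workaround I have been considering, completely untested, is to split the list into chunks with coreutils split and give each chunk its own output directory, so that no single directory ends up holding billions of files:

    # Untested idea: 10-million-line chunks, one output directory per chunk.
    split -l 10000000 links.txt chunk_
    for f in chunk_*; do
        mkdir -p "downloads/$f"
        wget --input-file="$f" --directory-prefix="downloads/$f"
    done

Is that kind of chunking sensible, or is there a better-established approach?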

What about resuming? Would wget check every entry from the top of the file to find where the download stopped or failed? That could take a very long time with a list this long.
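
The resume-related options I am aware of are --continue and --no-clobber, though I have no idea how either behaves when the input list has 2.5 billion entries:

    # Resume partially downloaded files:
    wget --continue --input-file=links.txt
    # Or simply skip URLs whose output file already exists locally:
    wget --no-clobber --input-file=links.txt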

Does wget keep a log? How big would the log file get? Is the log held in memory before it is saved somewhere? Can I disable the log?
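
The logging options I am assuming are relevant are the ones below; what I cannot judge is how large the resulting log would grow over 2.5 billion URLs:

    # Write messages to a log file instead of the terminal:
    wget --input-file=links.txt -o wget.log
    # One short line per URL instead of full progress output:
    wget --input-file=links.txt --no-verbose -o wget.log
    # Or suppress output entirely:
    wget --input-file=links.txt --quiet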

Basically, I am interested in best practices for downloading from a URL list of this size. Each link downloads around 250 bytes, so the total comes to roughly 625 GB.

Many thanks.

