Wget best practices on large files
I have a text file that is 150 GB in size, containing around 2.5 billion links that need to be downloaded.
I am wondering, if I feed this file to wget, whether wget loads the entire file's contents into memory, and whether that could potentially crash the machine.
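What I am currently considering, as a rough sketch, is splitting the list into chunks first so that no single wget run has to read the full 150 GB file (the filename links.txt and the chunk size are just placeholders):

    # split the 150 GB list into chunks of 1 million URLs each
    split -l 1000000 links.txt chunk_
    # feed one chunk at a time to wget
    wget --input-file=chunk_aa --directory-prefix=downloads/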
I am also interested in best practices when downloading this number of links. For example, will the download directory become unusable once it contains that many files? (One idea I had is sketched below.)
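A rough idea, assuming the chunked lists from above, is to give each chunk its own output directory so that no single directory grows too large:

    # one subdirectory per chunk keeps directory listings manageable
    for f in chunk_*; do
        wget --input-file="$f" --directory-prefix="downloads/$f/" --no-verbose
    done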
What about resuming? If a download stops or fails, will wget check every link from the top of the file again? That would take a very long time.
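The closest thing I have found so far is the --no-clobber option, which as far as I understand skips any URL whose output file already exists locally, though I assume it still has to walk the whole list:

    # skip URLs whose output file already exists from a previous run
    wget --input-file=chunk_aa --directory-prefix=downloads/chunk_aa/ --no-clobber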
Does wget keep a log? How big would the log file get? Is the log held in memory before it is written somewhere? Can I disable the log?
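If it matters, these are the options I would try for the log question (just a sketch, I have not verified how they behave at this scale):

    # write the log to a file instead of the terminal
    wget --input-file=chunk_aa --output-file=wget.log
    # or suppress output entirely
    wget --input-file=chunk_aa --quiet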
Basically, I am interested in best practices for downloading from a file of this size, where each link downloads around 250 bytes.
Many thanks.