Eric Tompkins
Dec 5, 2016

Bulk Download Files From the Command Line When Redesigning a Website

Sometimes when you are redesigning a website you don't want to copy over all of the files from the old website to the new one. Often there are a lot of content and structure changes, and it's undesirable to have to deal with legacy files.

However, when you do copy existing content over from the old site to the new site, you want an easy way to retrieve only the files that you need.

Here is what we do in this scenario.

1) Copy over your content, but don't copy the documents or files yet.

2) Use Screaming Frog SEO Spider (or something like it) to find any broken links.

3) Export the broken links as a CSV file and open that up in Excel. Delete any rows that are not for a document or image.

4) Delete all of the columns except for the one that contains the full URL of each file.

5) Delete any remaining extra rows, such as the header row.
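If you'd rather skip Excel entirely, steps 3 through 5 can be done from the terminal. This is a rough sketch, not a one-size-fits-all command: it assumes the URL is in the first column of the export, that the CSV contains no quoted commas, and that broken-links.csv is whatever name you gave the export. Adjust the extension list to match the file types on your site.

    # Keep only the URL column, then keep only the rows that point
    # at documents or images, and write the result to links.txt.
    cut -d',' -f1 broken-links.csv \
      | grep -Ei '\.(pdf|docx?|xlsx?|pptx?|jpe?g|png|gif)$' \
      > links.txt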

6) Make sure that the domain name for each file is for the old site and not the new one.
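If the export lists the URLs under the new domain, a quick find-and-replace from the terminal takes care of it. The domains below are placeholders; swap in your real old and new hostnames, and note that links.txt is the file produced by the sketch above (or your own saved list):

    # Point every URL back at the old site, which still hosts the files.
    sed 's|https://new.example.com|https://old.example.com|g' links.txt > download.txt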

7) Save the list as a plain text file, e.g. download.txt.

8) Open up your console/terminal and navigate to the folder where you saved the text file. Ideally the folder contains only the text file, so it's easy to tell which new files were downloaded.
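For example, assuming the list is saved as download.txt in the current directory, a dedicated folder is only a few commands away:

    mkdir downloads
    mv download.txt downloads/
    cd downloads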

9) Run: wget -i download.txt
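wget fetches every URL in the file, one per line, and saves each file into the current folder. (wget ships with most Linux distributions; on macOS it's available through Homebrew.) If you'd like the downloads to mirror the old site's directory structure, which makes re-uploading easier, wget's standard options can do that; the flags below are optional:

    # -x recreates the URL paths as directories, -nH drops the
    # hostname folder, and --wait=1 pauses between requests.
    wget -x -nH --wait=1 -i download.txt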

10) Upload the downloaded files to the new site.
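How you upload depends on your host. If you have shell access to the new server, an rsync one-liner is a common approach; the username, hostname, and destination path below are placeholders:

    # Mirror the local downloads folder to the new site's file area.
    rsync -av downloads/ user@new.example.com:/var/www/html/files/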
