Bash – argument list too long

I want to scp a large folder containing about 115k JSON files from a remote server to my local computer.

Let’s say that I use something like this:

scp username@remote:/remote_path/*.json /local_path/

However, when I try to do this I get the response:

argument list too long

How can I transfer the files then?

Asked By: Outcast


Consider transferring the whole directory instead of individual files:

scp -r username@remote:/remote_path /local_path/
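For background: the error comes from the kernel's limit on the total size of the argument list passed to a program, which the remote shell exceeds when it expands *.json into ~115k paths. You can inspect that limit (in bytes) on either machine with the POSIX getconf utility:

```shell
# Print the maximum combined size of arguments + environment for exec().
# A 115k-file glob expansion can easily exceed this.
getconf ARG_MAX
```

Transferring the directory as a whole, as above, sidesteps the limit because no glob is ever expanded.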

If that would transfer too much and you really only want the files whose names end in .json in that single directory, consider rsync instead (it has better facilities for filtering what gets transferred):

rsync -av --include='*.json' --exclude='*' username@remote:/remote_path/ /local_path/

This would only copy files whose names end in .json and ignore all other names. The trailing / on the source is significant here: it tells rsync to copy the contents of the directory rather than the directory itself.

The -a option makes rsync also transfer file metadata (timestamps, permissions, and the like) and recurse into subdirectories (though recursion is restricted by --exclude above), while -v enables verbose output.
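If you want to see how the --include/--exclude pair behaves before running it against the remote server, you can try it locally between two throwaway directories (this sketch assumes rsync is installed; the paths are made up for illustration):

```shell
# Hypothetical local demo of the filter pair: only *.json is copied.
mkdir -p /tmp/rsync_src /tmp/rsync_dst
touch /tmp/rsync_src/a.json /tmp/rsync_src/b.json /tmp/rsync_src/notes.txt

# --include='*.json' whitelists JSON files; --exclude='*' drops the rest.
rsync -av --include='*.json' --exclude='*' /tmp/rsync_src/ /tmp/rsync_dst/

ls /tmp/rsync_dst   # a.json and b.json, but not notes.txt
```

rsync evaluates filter rules in order, so the first rule that matches a name wins; that is why the broad --exclude='*' must come after the narrower --include.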

A third option would be to create a tar archive of the remote directory, or at least the files that you’d want to transfer, and then scp that archive over to the local system. In fact, that could be done in one go with ssh, simulating scp -r:

ssh username@remote 'tar -c -f - -C /remote_path .' | tar -x -f - -C /local_path
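If you only want the .json files in the tar stream, a glob on the remote side would run into the same argument-list limit, but find can feed the file names to tar without ever building a huge argument list. This variant assumes GNU tar on the remote host (for --null and -T -):

```shell
# Archive only *.json on the remote side, streaming names from find
# so no argument list is ever built; unpack locally as before.
ssh username@remote \
  'cd /remote_path && find . -maxdepth 1 -name "*.json" -print0 |
     tar -c --null -T - -f -' |
  tar -x -f - -C /local_path
```

Here -print0 and --null delimit file names with NUL bytes, so names containing spaces or newlines survive the pipeline intact.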
Answered By: Kusalananda