Just a comment on your code: it is very inefficient to open a new FTP connection every time you call getRecursiveDirListing(); depending on the server, that can add several seconds of delay per directory listing. It is A LOT faster to open the FTP connection once, before the first call to getRecursiveDirListing(), and leave it open the whole time. If you try this code against a remote server instead of your localhost server, you will see the speedup.

 use Net::FTP;
 use File::Listing qw(parse_dir);

 sub getRecursiveDirListing {
      # array to hold the directories found; use a lexical (my) variable,
      # not local, so each recursive call gets its own copy
      my @dirs = ();

      # directory parameter passed to the subroutine
      my $dir = $_[0];

      # if a directory was passed to the subroutine, change the remote directory
      $ftp->cwd($dir) if($dir);

      # get a long-format listing of the current directory only; the
      # subroutine does the recursion itself, so -R is not needed here
      my @ls = $ftp->ls('-l');

      # the current working directory on the remote server
      my $cur_dir = $ftp->pwd();

      # parse and loop through the directory listing
      foreach my $file (parse_dir(\@ls)) {
          my($name, $type, $size, $mtime, $mode) = @$file;

          next if($type ne 'd'); # skip this entry unless it is a directory

          push(@dirs, "$cur_dir/$name"); # remember the directory

          # recursive call for the subdirectory; merge its results into @dirs
          # here, so nothing is lost when there is more than one subdirectory
          push(@dirs, getRecursiveDirListing("$cur_dir/$name"));
      }

      return @dirs;
 }
Usage being:
print("Connecting to FTP ... ");
# create a new FTP connection (on failure Net::FTP puts the reason in $@, not $!)
$ftp = Net::FTP->new("ftp.hp.com", Debug => 0) or die("Cannot connect: $@");

# log in to the server
$ftp->login("anonymous", "anonymous") or die("Login failed: " . $ftp->message);
print(" Logged in ... \n");
@y = getRecursiveDirListing ("/pub/softlib/");
print join("\n",@y);

# close the FTP connection
$ftp->quit;
Thanks for the code; it helped me get started. I don't usually program in Perl.

Thanks again.