Subject: Re: ftp(1) causing lots of `400 Bad Request'
To: Todd Vierling <tv@pobox.com>
From: Luke Mewburn <lukem@cs.rmit.edu.au>
List: tech-userlevel
Date: 04/27/1999 11:21:58
Todd Vierling writes:
> Some of the more recent updates to ftp(1) - say, after 1.3G or so - have
> broken HTTP fetches from many systems.  As a prime example, try:
> 
> ftp http://www.engelschall.com/sw/mod_ssl/distrib/mod_ssl-2.2.8-1.3.6.tar.gz 
> 
> This is a perfectly valid URL, on an Apache server, yet it yields `400 Bad
> Request'.  Luke?  :)

This was fixed on 1999/03/22 in:
	/usr/src/usr.bin/ftp/fetch.c 1.52

The commit message was:

====
revision 1.52
date: 1999/03/22 07:36:40;  author: lukem;  state: Exp;  lines: +94 -82
* implement -R; restart non-proxied command-line FTP xfers
* fix fetch_ftp() so that hcode parsing is not done for file:// urls
  (a } in the wrong place, and code at the wrong indent level...)
* change outfile to being a global (so it gets correctly reset)
* change parse_url to not remove leading '/' for non ftp urls.
  whilst this is not totally rfc1738 compliant, other code kinda
  assumes this is the case, and it doesn't hurt
====

``update your source'' ;-)

BTW: whilst testing this, I got bitten by the same bug, i.e. I hadn't
done a make install in usr.bin/ftp either, even though I had fixed the
code, so /usr/bin/ftp didn't work. It took a few minutes to track
*that* one down.

I also did a quick check, and 1.4 already has the correct version.