On Thu, Sep 20, 2012 at 11:45:52PM +0200, John Tytgat wrote:
> Could very well be a bug in that particular version (or Debian patches
> if any). Have you tried mining the wget ML/IRC and/or manual mentioned
> at www.gnu.org/software/wget/ ?
Can you give a specific command line that's failing? I just used the
autobuilder to download the source for Debian's wget 1.13.4-3, built it on
Linux, and tried:
$ src/wget http://www.somethi.ng/valid.html
--2012-09-21 17:44:01-- http://www.somethi.ng/valid.html
Resolving www.somethi.ng (www.somethi.ng)... failed: Name or service not known.
wget: unable to resolve host address `www.somethi.ng'
$ src/wget -nv http://www.somethi.ng/valid.html
wget: unable to resolve host address `www.somethi.ng'
$ src/wget -nv http://www.somethi.ng/valid.html
wget: unable to resolve host address `www.somethi.ng'
$ src/wget -nv http://www.somethi.ng/valid.html
wget: unable to resolve host address `www.somethi.ng'
$ src/wget -nv http://not.val.id/at.all
wget: unable to resolve host address `not.val.id'
$ src/wget -nv http://not.val.id/at.all
wget: unable to resolve host address `not.val.id'
$ src/wget -nv http://not.val.id/at.all
wget: unable to resolve host address `not.val.id'
$ src/wget -nv http://not.val.id/at.all
wget: unable to resolve host address `not.val.id'
$ src/wget -nv --spider http://not.val.id/at.all
wget: unable to resolve host address `not.val.id'
$ src/wget -nv --spider http://not.val.id/at.all
wget: unable to resolve host address `not.val.id'
$ src/wget -nv --spider http://www.somethi.ng/valid.html
wget: unable to resolve host address `www.somethi.ng'
$ src/wget -nv --spider http://www.somethi.ng/valid.html
wget: unable to resolve host address `www.somethi.ng'
$ src/wget -nv --spider http://www.somethi.ng/valid.html
wget: unable to resolve host address `www.somethi.ng'
$ src/wget -nv --spider http://www.somethi.ng/valid.html
wget: unable to resolve host address `www.somethi.ng'
I'm puzzled that it got '200 OK', because that status can only have been
returned by a server. You don't have any kind of proxy in the way - for
example, your ISP isn't doing transparent proxying or deep packet inspection
or anything? Try
*wget -v --save-headers http://www.riscos.com
and look at the index/html file. I get:
HTTP/1.1 200 OK
Date: Fri, 21 Sep 2012 16:52:22 GMT
Server: Apache/2
Last-Modified: Mon, 09 Jan 2012 17:43:58 GMT
ETag: "2fb3-4b61bf1721380"
Accept-Ranges: bytes
Content-Length: 12211
Keep-Alive: timeout=15, max=100
Connection: Keep-Alive
Content-Type: text/html
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
[etc]
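(--save-headers prepends the HTTP response headers to the saved file,
separated from the body by a blank line, as above. A quick Python sketch to
split a saved file back into the two parts - the helper name is mine:)

```python
# Split a file saved with `wget --save-headers` into the HTTP header
# block and the body; the two parts are separated by a blank line.
def split_saved(raw: bytes):
    head, sep, body = raw.partition(b"\r\n\r\n")
    if not sep:  # fall back for bare-LF line endings
        head, _, body = raw.partition(b"\n\n")
    return head.decode("latin-1").splitlines(), body

sample = b"HTTP/1.1 200 OK\r\nContent-Type: text/html\r\n\r\n<html></html>"
headers, body = split_saved(sample)
# headers[0] == "HTTP/1.1 200 OK", body == b"<html></html>"
```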
Do you get any more headers than that?
Theo
_______________________________________________
GCCSDK mailing list gcc@gccsdk.riscos.info
Bugzilla: http://www.riscos.info/bugzilla/index.cgi
List Info: http://www.riscos.info/mailman/listinfo/gcc
Main Page: http://www.riscos.info/index.php/GCCSDK