I'm writing a script to grab a webpage from my university and check every ten minutes or so whether it has changed.
My problem is fetching the page itself. I'm trying to use wget... I have tried --user and --password, loading cookies from Firefox and Opera, and combinations of both, but I pretty much always get a 401 - Unauthorized....
How can I detect which authentication method is used?
The page is this one:
http://elearning.ua.pt/bbcswebdav/courses/42545/Site/index.htm
I don't have access lower in the directory tree, though I can list a few files in ...45/Site.
I can also provide a generic login, if you think that helps...
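For reference, this is roughly the kind of loop I'm aiming for (the credentials here are placeholders, and the change check is just a hash comparison):

    #!/bin/bash
    # Rough sketch: fetch the page every ten minutes and report changes.
    # USER and PASS are placeholders, not my real login.
    URL='http://elearning.ua.pt/bbcswebdav/courses/42545/Site/index.htm'
    USER='name@ua.pt'
    PASS='changeme'

    old=''
    while true; do
        # -q: quiet, -O -: write the page to stdout so we can hash it
        new=$(wget -q -O - --user="$USER" --password="$PASS" "$URL" | md5sum)
        if [ -n "$old" ] && [ "$new" != "$old" ]; then
            echo "Page changed at $(date)"
        fi
        old=$new
        sleep 600   # ten minutes
    done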
Did you try changing the user agent? wget --user-agent=firefox ...
For some reason, I cannot connect to the server mentioned -- it seems your university has turned its DNS off for the weekend. Anyway, try the -S option to wget; you should see the HTTP response headers, with WWW-Authenticate: among them.
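Something like this (URL taken from your first post; on a 401, wget prints the server's headers, and the WWW-Authenticate: line names the scheme):

    # -S prints the server's response headers; --spider avoids saving the page.
    # Look for a WWW-Authenticate: line ("Basic", "Digest", ...) in the output.
    wget -S --spider http://elearning.ua.pt/bbcswebdav/courses/42545/Site/index.htm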
Did you try changing the user agent? wget --user-agent=firefox ...
I did, to MSIE 7 or something like that, but it didn't help, so I didn't bother any more...
I will check those headers and post them here...
And there is something wrong with the server today, indeed....
It's Basic authentication...
I checked the headers just now.
I'm guessing it was a bash problem rather than wget or the server... Before, I wrote the user and password directly in the command, with --user=name@ua.pt and the same for the password. My password, though, has some special characters which I had to escape... the shell was probably mangling them somehow.... Now I've assigned them to a variable and used its contents...
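For anyone who hits the same thing, the difference looks roughly like this (the password shown is made up):

    URL='http://elearning.ua.pt/bbcswebdav/courses/42545/Site/index.htm'

    # Broken: with the password unquoted, bash expands $s to nothing (and,
    # at an interactive prompt, !w triggers history expansion), so wget
    # receives a mangled string.
    wget --user=name@ua.pt --password=pa$s!word "$URL"

    # Working: single quotes pass the value through verbatim, and quoting
    # the variable expansion keeps it intact when handed to wget.
    PASS='pa$s!word'
    wget --user=name@ua.pt --password="$PASS" "$URL"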
It is now working.
Thanks everyone.
Last edited by nDray (2008-01-19 15:50:40)