Wget username/password authentication error

I’ve tried both

wget --user=myuser --password=mypassword myfile

and

wget --ftp-user=myuser --ftp-password=mypassword myfile

but I keep getting the error

HTTP request sent, awaiting response... 401 Authorization Required
Authorization failed.

I know the file is there, and I know the username/password are correct — I can ftp in with no problem. Any thoughts on what’s going on here? How do I even tell if wget is paying attention to the username/password that I’m giving it? (The error is the same if I simply don’t provide that info.)
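One way to check (a hedged sketch; the host name is a placeholder, since the question only shows "myfile") is to run wget with debug output and look for an Authorization header in the request blocks it prints. By default that header only appears on the retry that follows the server's 401 challenge:

# -d prints every request wget sends; an "Authorization: Basic ..." line inside a
# "---request begin---" block means the credentials really are being transmitted
wget -d --user=myuser --password=mypassword http://server.example.com/myfile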

asked Feb 5, 2009 at 23:54

Jesse Beder

Try wget --http-user=username --http-password=password http://…

answered Jun 25, 2009 at 7:11


Are you using an "ftp://" URL? From the error message it appears that you're making a request for an "http://" URL.
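If the file really does live on an FTP server, a minimal sketch (the host name is a placeholder) spells out the scheme explicitly, so wget does not fall back to HTTP:

wget --ftp-user=myuser --ftp-password=mypassword ftp://ftp.example.com/path/myfile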

answered Feb 5, 2009 at 23:58

Marc Novakowski

One more comment:

Setting --user and --password sets the username/password for both FTP and HTTP requests, so that's more general.

In my case nothing worked except using --ask-password.

I was using an https URL.
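For reference, a sketch of that combination with a placeholder URL; --ask-password prompts for the password interactively, so it never appears on the command line or in the shell history:

wget --user=myuser --ask-password https://example.com/myfile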

answered Feb 3, 2014 at 15:11

Frederick

It might be useful to add that if you need to prefix a domain name, the backslash must be escaped, i.e. "\" is preceded by another "\", e.g. "domain\\username". The same presumably applies if the password has any characters that require escaping (I suppose; I haven't tested it).

wget --http-user=domain\\username --http-password=password http://...
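An alternative sketch, assuming a Bash-like shell: single quotes keep the backslash (and most other characters a password might contain) literal, so it does not need to be doubled:

wget --http-user='domain\username' --http-password='password' http://...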

answered Jul 3, 2013 at 3:37

snugam

Below is the command and the response from wget. The first time it does not use the provided username and password and gets a 401; it then retries with the credentials and gets a 200.

This works fine with curl, but the same thing happens via Postman as well. What is this phenomenon, and why does it happen?

$> wget 'http://userName:password@host:port/v1/api'
--2018-08-31 16:06:01--  http://userName:password@host:port
Connecting to host:port... connected.
HTTP request sent, awaiting response... 401 Unauthorized
Authentication selected: Basic realm="myApp", API-Key realm="myApp"
Reusing existing connection to host:port.
HTTP request sent, awaiting response... 200 OK
Length: 146 [application/json]
Saving to: 'api'

api 100%[==================================================================================================================>]     146  --.-KB/s   in 0s

2018-08-31 16:06:01 (9.28 MB/s) - 'api' saved [146/146]

asked Aug 31, 2018 at 11:11

Saurabh

wget and most other programs wait for a Basic authentication challenge from the server before sending the credentials.

This has been wget's default behavior since version 1.11 (1.10.2 and prior sent the credentials unsolicited). You can change that behavior with the --auth-no-challenge option:

If this option is given, Wget will send Basic HTTP authentication information (plaintext username and password) for all requests, just like Wget 1.10.2 and prior did by default.

Use of this option is not recommended, and is intended only to support some few obscure servers, which never send HTTP authentication challenges, but accept unsolicited auth info, say, in addition to form-based authentication.
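For completeness, a sketch of the option in use against such a server (the URL is a placeholder):

# Send the Authorization header on the very first request instead of waiting for a 401 challenge
wget --auth-no-challenge --user=myuser --password=mypassword http://example.com/protected/file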


This is the general default workflow for HTTP authentication: the client first sends the request without credentials, the server answers 401 with a WWW-Authenticate challenge, and the client repeats the request with an Authorization header.

[diagram of the HTTP authentication challenge/response workflow omitted]

Read more about HTTP Authentication.

answered Aug 31, 2018 at 11:42

pLumo


I’m at a total loss as to why I’m getting Authorization failed messages when I try to perform basic HTTP auth using wget on an internal webpage.

I’ve been using this wget command to pull a csv file from an internal webpage for around 4 months on a daily basis without issues.

wget --user username --password "password" http://internalwebsite.domain.com/data.csv

The site is not SSL and does prompt for credentials when visiting it with a web browser. I’ve verified that the credentials I am using in the call work in a browser to download the CSV.

I’ve tried modifying the wget command by adding the following additional flags:

--auth-no-challenge

I also changed --user to --http-user and --password to --http-passwd, but those options do not work either.

When I attempt the connection with wget, I get a 401 Unauthorized response and an Authorization failed message.

The site is SharePoint, and unless something changed this month in IIS that disables calls from wget, I cannot figure out for the life of me what changed…

Any help would be appreciated.
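One hedged diagnostic sketch, reusing the command from the question: print the server's response headers and look at the WWW-Authenticate line in the 401 reply, since a scheme other than Basic (for example NTLM or Negotiate) would explain why these options stopped working:

# -S prints the response headers wget receives, including the 401's WWW-Authenticate line
wget -S --user username --password "password" http://internalwebsite.domain.com/data.csv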


First off, I am not a Linux guy, but I have to pretend to be one sometimes as a web developer.

So, we have a Red Hat server and I’m using wget in crontab to run some PHP scripts.

We’ve been doing this for some time now and it’s been working fine.

I tried to add another script that uses wget to run a PHP script behind HTTP authentication. However, even though the URL works fine and the username and password are correct, we get Connection Timed Out errors every time.

What might cause wget to work for unauthenticated URLs, but not authenticated ones?

I've tried --user=/--password=, --http-user=/--http-password= and Username:Password@ in the URL, and all three fail the same way.

Here’s the command in question:

[blahblah user]# wget -t 5 -O /dev/null 'http://Username:Password1!@test.example.com/sub/dir/file-name.php'
--2010-07-07 10:11:55--  http://Username:*password*@test.example.com/sub/dir/file-name.php
Resolving test.example.com... 000.000.000.000
Connecting to test.example.com|000.000.000.000|:80... failed: Connection timed out.
Retrying.

[Repeat ad nauseum]

Any thoughts? Again, wget works, the file with authentication works, but wget calling the file with authentication does not work.

UPDATE: Actually, I get the same timeout if I access the authenticated URL without authentication. Could that mean that Apache is rejecting wget requests for authentication outright? (I’m really treading into speculation territory here. I know almost nothing about Apache configuration.)


Why are you playing around with wget? Better to use a headless browser to automate this task.

What is a headless browser, you ask?

A headless browser is a web browser without a graphical user interface.
They provide automated control of a web page in an environment similar to popular web browsers, but are executed via a command line interface or using network communication.

Two popular headless browsers are phantomjs (javascript) and Ghost.py (python).

Solution using phantomjs

First you will need to install phantomjs. On Ubuntu based systems, you can install it using the package manager or you could build it from source from their home page.

sudo apt-get install phantomjs

After this, you write a JavaScript script and run it with phantomjs:

phantomjs script.js

That’s it.

Now, to learn how to implement it for your case, head over to its quickstart guide. As an example, to log in to Facebook automatically and take a snapshot, one could use the gist provided here:

// This code logs in to your Facebook account and takes a snapshot of it.
var page = require('webpage').create();

// Runs inside the page: fill in the login form and submit it
var fillLoginInfo = function(){
    var frm = document.getElementById("login_form");
    frm.elements["email"].value = 'your fb email/username';
    frm.elements["pass"].value = 'password';
    frm.submit();
};

page.onLoadFinished = function(){
    // Still on the login page: submit the credentials and wait for the next load
    if(page.title == "Welcome to Facebook - Log In, Sign Up or Learn More"){
        page.evaluate(fillLoginInfo);
        return;
    }
    // Logged in: save a screenshot and exit
    page.render('./screens/some.png');
    console.log("completed");
    phantom.exit();
};

page.open('https://www.facebook.com/');

Look around the documentation to implement it for your specific case. If you run into trouble with an https website due to SSL errors, run your script like this:

phantomjs --ssl-protocol=any script.js

Solution using Ghost.py

To install Ghost.py, you will need pip:

sudo apt-get install python-pip   #On a Debian based system
sudo pip install Ghost.py

Ghost.py is now installed. To use it inside a Python script, just follow the documentation on its home page. I've tried Ghost.py on an https website but it somehow didn't work for me; do try it and see if it works for you.

UPDATE: GUI-based solution

You can also use tools like Selenium to automate the login process and retrieve the information. It is pretty easy to use. You just need to install a plugin for your browser from here, and then you can record your process and replay it later.

Question:

How do I download a web page that requires a username and password?

For example, I want to download this page after entering a username and password:

http://forum.ubuntu-it.org/index.php

Best answer:

Try the solution presented here:

  • http://www.unix.com/shell-programming-scripting/131020-using-wget-curl-http-post-authentication.html

       # Log in to the server.  This can be done only once.
       wget --save-cookies cookies.txt \
            --post-data 'user=foo&password=bar' \
            http://server.com/auth.php

       # Now grab the page or pages we care about.
       wget --load-cookies cookies.txt \
            -p http://server.com/interesting/article.php

A note for others who may stumble upon this:

  • The above replicates a manual login to a site that has a form with two input fields: one named user and one named password
  • The form has an action attribute set to http://server.com/auth.php
  • The form does not use JavaScript
    • A good hint that it does use JavaScript is an onsubmit attribute on the form element
    • Note that this is far from the only way to set attributes: JavaScript anywhere on the page, or any of the statically or dynamically loaded script files, can modify the form elements
  • The best way to see this is to load the page and inspect it live, e.g. with Firebug for Firefox

So, if the attribute names and the form URL are different, you need to adjust the parameters of the first wget command accordingly.
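A quick sketch of that inspection from the command line, as an alternative to a browser tool (the login-page URL is a placeholder):

# Download the login page and list its form and input elements
# to find the field names and the action URL
wget -O loginpage.html http://server.com/login.php
grep -i -E '<form|<input' loginpage.html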

If it uses JavaScript, there is a chance this will not work at all; for example, the OP's example website uses client-side JavaScript hashing, so an external call using wget does not set the required form fields (in the case of the Ubuntu site, hash_passwrd).

Answer #1

Use the options:

--password=PASS
--user=USERNAME

i.e.: wget http://www.example.com --user=joe --password=schmoe

You can also add the --auth-no-challenge option if you run into further problems:

i.e.: wget http://www.example.com --user=joe --password=schmoe --auth-no-challenge

Answer #2

The following wget commands should let you access pages on a website that requires a username and password:

wget http://username:password@example.org/url/
wget --http-user=user --http-password=password http://example.org/url/

Answer #3

Maybe this will help. The site I was trying to log in to had some hidden fields that I needed to fetch before I could log in successfully. So the first wget fetches the login page to find the extra fields, the second wget logs in to the site and saves the cookies, and the third uses those cookies to fetch the page you are after.

#!/bin/sh

# get the login page to get the hidden field data
wget -a log.txt -O loginpage.html http://foobar/default.aspx
hiddendata=`grep value < loginpage.html | grep foobarhidden | tr '=' ' ' | awk '{print $9}' | sed 's/"//g'`
rm loginpage.html

# login into the page and save the cookies
postData=user=fakeuser'&'pw=password'&'foobarhidden=${hiddendata}
wget -a log.txt -O /dev/null --post-data ${postData} --keep-session-cookies --save-cookies cookies.txt http://foobar/default.aspx

# get the page your after
wget -a log.txt -O results.html --load-cookies cookies.txt http://foobar/lister.aspx?id=42
rm cookies.txt

There is useful information about this in another post: superuser → using wget to download pdf files from a site that requires cookies to be set

Answer #4

Use the options --user=X --password=Y to specify username X and password Y.


I already spent several hours googling around lots of sites but I can't manage to fix it. Let's hope someone can help me here:

LAPTOP (the file gets downloaded successfully, without issues):

wget -d http://EXAMPLE.COM:80/movie/USERNAME/PASSWORD/video.mkv
DEBUG output created by Wget 1.17.1 on linux-gnu.

Reading HSTS entries from /home/mylinuxuser/.wget-hsts
URI encoding = ‘UTF-8’
--2018-08-03 01:30:35--  http://EXAMPLE.COM/movie/USERNAME/PASSWORD/video.mkv
Resolving EXAMPLE.COM (EXAMPLE.COM)... SOME_IP_ADDRESS, DIFFERENT_IP_ADDRESS
Caching EXAMPLE.COM => SOME_IP_ADDRESS DIFFERENT_IP_ADDRESS
Connecting to EXAMPLE.COM (EXAMPLE.COM)|SOME_IP_ADDRESS|:80... connected.
Created socket 3.
Releasing 0x0000558aa8dbba40 (new refcount 1).

---request begin---
GET /movie/USERNAME/PASSWORD/video.mkv HTTP/1.1
User-Agent: Wget/1.17.1 (linux-gnu)
Accept: */*
Accept-Encoding: identity
Host: EXAMPLE.COM
Connection: Keep-Alive

---request end---
HTTP request sent, awaiting response... 
---response begin---
HTTP/1.1 302 Found
Server: nginx/1.14.0
Date: Fri, 03 Aug 2018 00:14:42 GMT
Content-Type: text/html; charset=UTF-8
Transfer-Encoding: chunked
Connection: keep-alive
Access-Control-Allow-Origin: *
Location: http://ANOTHER_IP_ADDRESS:80/movie/USERNAME/PASSWORD/video.mkv?token=SxoOVkYKEF4WAAcAU1ZTBlBRUFMKBgBUUlFSBQgHAQkBClNVAVoEAVYXGkEVQRYGAAtuCFdHCgddV1wGHEFESlVKOV5RQAhGDA0OUVMDRk9DElgMVkcKB1BSVwUGUAIAAxRER1wGEF4WBgdXXgNGT0MDSRVWF15XCT4AUkYKUlwSAkQVGUBdCmtRUw4HWwBBW0QBQx9HWUUVQ14VcwJDSVhXCFIVNVMWUV1ZFhVQRCETCVAFUQReUkUyAUVGClJcQxpKFVcLRhZVQVNBXBdWUFBXE00RBl9DCxUWThJZE35yGkoVUBpGAVpGXwwIF15BDA1HQx9HWUU6EwFERBFUWF1dFBUPQAJGGBdbAh5qBwwPCFQCRwxfWBZDXhUBQR0bXVcIXkENQDtEXFJBXFsRDw0b

---response end---
302 Found
Registered socket 3 for persistent reuse.
URI content encoding = ‘UTF-8’
Location: http://ANOTHER_IP_ADDRESS:80/movie/USERNAME/PASSWORD/video.mkv?token=SxoOVkYKEF4WAAcAU1ZTBlBRUFMKBgBUUlFSBQgHAQkBClNVAVoEAVYXGkEVQRYGAAtuCFdHCgddV1wGHEFESlVKOV5RQAhGDA0OUVMDRk9DElgMVkcKB1BSVwUGUAIAAxRER1wGEF4WBgdXXgNGT0MDSRVWF15XCT4AUkYKUlwSAkQVGUBdCmtRUw4HWwBBW0QBQx9HWUUVQ14VcwJDSVhXCFIVNVMWUV1ZFhVQRCETCVAFUQReUkUyAUVGClJcQxpKFVcLRhZVQVNBXBdWUFBXE00RBl9DCxUWThJZE35yGkoVUBpGAVpGXwwIF15BDA1HQx9HWUU6EwFERBFUWF1dFBUPQAJGGBdbAh5qBwwPCFQCRwxfWBZDXhUBQR0bXVcIXkENQDtEXFJBXFsRDw0b [following]
] done.
URI content encoding = None
--2018-08-03 01:30:36--  http://ANOTHER_IP_ADDRESS/movie/USERNAME/PASSWORD/video.mkv?token=SxoOVkYKEF4WAAcAU1ZTBlBRUFMKBgBUUlFSBQgHAQkBClNVAVoEAVYXGkEVQRYGAAtuCFdHCgddV1wGHEFESlVKOV5RQAhGDA0OUVMDRk9DElgMVkcKB1BSVwUGUAIAAxRER1wGEF4WBgdXXgNGT0MDSRVWF15XCT4AUkYKUlwSAkQVGUBdCmtRUw4HWwBBW0QBQx9HWUUVQ14VcwJDSVhXCFIVNVMWUV1ZFhVQRCETCVAFUQReUkUyAUVGClJcQxpKFVcLRhZVQVNBXBdWUFBXE00RBl9DCxUWThJZE35yGkoVUBpGAVpGXwwIF15BDA1HQx9HWUU6EwFERBFUWF1dFBUPQAJGGBdbAh5qBwwPCFQCRwxfWBZDXhUBQR0bXVcIXkENQDtEXFJBXFsRDw0b
conaddr is: SOME_IP_ADDRESS
Releasing 0x0000558aa8dbe590 (new refcount 0).
Deleting unused 0x0000558aa8dbe590.
Connecting to ANOTHER_IP_ADDRESS:80... connected.
Created socket 4.
Releasing 0x0000558aa8dbe590 (new refcount 0).
Deleting unused 0x0000558aa8dbe590.

---request begin---
GET /movie/USERNAME/PASSWORD/video.mkv?token=SxoOVkYKEF4WAAcAU1ZTBlBRUFMKBgBUUlFSBQgHAQkBClNVAVoEAVYXGkEVQRYGAAtuCFdHCgddV1wGHEFESlVKOV5RQAhGDA0OUVMDRk9DElgMVkcKB1BSVwUGUAIAAxRER1wGEF4WBgdXXgNGT0MDSRVWF15XCT4AUkYKUlwSAkQVGUBdCmtRUw4HWwBBW0QBQx9HWUUVQ14VcwJDSVhXCFIVNVMWUV1ZFhVQRCETCVAFUQReUkUyAUVGClJcQxpKFVcLRhZVQVNBXBdWUFBXE00RBl9DCxUWThJZE35yGkoVUBpGAVpGXwwIF15BDA1HQx9HWUU6EwFERBFUWF1dFBUPQAJGGBdbAh5qBwwPCFQCRwxfWBZDXhUBQR0bXVcIXkENQDtEXFJBXFsRDw0b HTTP/1.1
User-Agent: Wget/1.17.1 (linux-gnu)
Accept: */*
Accept-Encoding: identity
Host: ANOTHER_IP_ADDRESS
Connection: Keep-Alive

---request end---
HTTP request sent, awaiting response... 
---response begin---
HTTP/1.1 200 OK
Server: nginx
Date: Fri, 03 Aug 2018 00:29:53 GMT
Content-Type: video/x-matroska
Content-Length: 2071078584
Connection: keep-alive
Accept-Ranges: 0-2071078584
Content-Range: bytes 0-2071078583/2071078584

---response end---
200 OK
Disabling further reuse of socket 3.
Closed fd 3
Registered socket 4 for persistent reuse.
Length: 2071078584 (1.9G) 
Saving to: ‘video.mkv’

VPS (it doesn't download anything, as can be seen below):

wget -d http://EXAMPLE.COM:80/movie/USERNAME/PASSWORD/video.mkv
DEBUG output created by Wget 1.19.4 on linux-gnu.

Reading HSTS entries from /root/.wget-hsts
URI encoding = ‘UTF-8’
Converted file name 'video.mkv' (UTF-8) -> 'video.mkv' (UTF-8)
--2018-08-03 00:35:19--  http://EXAMPLE.COM/movie/USERNAME/PASSWORD/video.mkv
Resolving EXAMPLE.COM (EXAMPLE.COM)... SOME_IP_ADDRESS, DIFFERENT_IP_ADDRESS
Caching EXAMPLE.COM => SOME_IP_ADDRESS DIFFERENT_IP_ADDRESS
Connecting to EXAMPLE.COM (EXAMPLE.COM)|SOME_IP_ADDRESS|:80... connected.
Created socket 3.
Releasing 0x0000561201148270 (new refcount 1).

---request begin---
GET /movie/USERNAME/PASSWORD/video.mkv HTTP/1.1
User-Agent: Wget/1.19.4 (linux-gnu)
Accept: */*
Accept-Encoding: identity
Host: EXAMPLE.COM
Connection: Keep-Alive
Referer: EXAMPLE.COM

---request end---
HTTP request sent, awaiting response... 
---response begin---
HTTP/1.1 401 Unauthorized
Server: nginx/1.14.0
Date: Fri, 03 Aug 2018 00:19:26 GMT
Content-Type: text/html; charset=UTF-8
Transfer-Encoding: chunked
Connection: keep-alive
Access-Control-Allow-Origin: *

---response end---
401 Unauthorized
Registered socket 3 for persistent reuse.
] done.

Username/Password Authentication Failed.

Does anyone know what command I should run on the VPS, or what else I should install, so it can download the file?
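Judging from the two debug logs, the only visible difference in the requests is the extra "Referer: EXAMPLE.COM" header sent by the VPS copy of wget, so one hedged first check (an assumption, not a confirmed cause) is whether a wgetrc file or a shell alias on the VPS injects it:

# Look for header/referer settings that wget picks up automatically
grep -i -E 'referer|header' /etc/wgetrc ~/.wgetrc 2>/dev/null
# Check whether "wget" is the real binary rather than an alias or function that adds options
type wget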

asked Aug 3, 2018 at 0:48

Juan Martin

Hello. Why, when using cookies with Wget to log in to a site, does the login only work for the first page? The site gets downloaded, but I am only logged in on SITE/index.php. I have tried it in various ways:

wget -x --load-cookies C:cookies.txt "SITE" --page-requisites -r -l 10 "SITE"

wget -x --load-cookies C:cookies.txt --page-requisites -r -l 10 "SITE"

wget --load-cookies C:cookies.txt --page-requisites -r -l 10 "SITE"

wget -x --load-cookies C:cookies.txt --mirror --convert-links --page-requisites -r -l 10 "SITE"

and many other variations…
What is wrong? Why do I only end up logged in on /index.php? To obtain the cookies, I used the "cookies.txt" extension for Google Chrome.
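A hedged sketch of the alternative approach shown earlier in this thread: instead of exporting browser cookies, log in with wget itself, keep the session cookies it receives, and reuse them for the recursive download (the login URL and the form field names are placeholders):

# Log in once and keep the session cookies
wget --save-cookies cookies.txt --keep-session-cookies --post-data 'user=foo&password=bar' "http://SITE/index.php?action=login"

# Reuse those cookies for the recursive download
wget --load-cookies cookies.txt --page-requisites -r -l 10 "http://SITE/"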
