
ACCESS TOKEN not regenerating multiple times in a single execution of script #113

Closed
renatowow14 opened this issue Oct 12, 2020 · 75 comments

@renatowow14

Command: gupload /home/user/folder

This command reproduces the error; I have 826 sub-folders and 7593 files.

/root/.google-drive-upload/bin/gupload: line 642:: No such file or directory
/root/.google-drive-upload/bin/gupload: line 647: /usr/bin/file: Argument list too long
/root/.google-drive-upload/bin/gupload: line 647: /usr/bin/mimetype: Argument list too long

@Akianonymus
Collaborator

Run gupload --info and send the output.

Also, run gupload -D /home/user/folder 2>| log 1>&2

Send the log file contents, or just the file.

@renatowow14
Author

Run gupload --info and send the output.

Also, run gupload -D /home/user/folder 2>| log 1>&2

Send the log file contents, or just the file.

I apologize; my problem was with the absolute path I was passing. When I moved the folder to the / (root) directory, for example, and uploaded it, it worked. But now I get errors uploading all the files, and several of them end up missing. I ran the command I was given and will post the log output here as soon as it finishes, including the part where the errors start.

The directory I was trying to upload when I hit the earlier upload error was:
/home/renato/Área\ de\ Trabalho/BKP_04052020/LAPIG

@renatowow14
Author

renatowow14 commented Oct 13, 2020

Here is the log file from the attempt to upload a folder containing several subfolders and files to my Google Drive, without success. I await a reply; thanks for the support.

log.gz ( link removed due to sensitive information )

Status: 602 Uploaded | 3772 Failed

@renatowow14
Author

gupload --info:

REPO: "labbots/google-drive-upload"
INSTALL_PATH: "/root/.google-drive-upload/bin"
INSTALLATION: "bash"
TYPE: "release"
TYPE_VALUE: "latest"
LATEST_INSTALLED_SHA: "7a64a39ec274ac8fca855adbb6f93edc7c6f96c2"
CONFIG: "/root/.googledrive.conf"

@Akianonymus
Collaborator

So, I looked at the log today. The problem is that if the access token expires before all the files are uploaded, the script doesn't renew it automatically until it is executed again.

I just have to add the required code for that. Will try to do it ASAP.
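For reference, the renewal itself is a single OAuth2 request. A minimal sketch of the step that needs to run mid-execution (assuming the standard Google token endpoint, and that CLIENT_ID, CLIENT_SECRET and REFRESH_TOKEN are already loaded from the config; this is not the script's actual code):

RESPONSE="$(curl -s "https://oauth2.googleapis.com/token" \
    --data "client_id=${CLIENT_ID}&client_secret=${CLIENT_SECRET}&refresh_token=${REFRESH_TOKEN}&grant_type=refresh_token")"

# crude no-dependency JSON extraction, in the same spirit as the script
ACCESS_TOKEN="$(printf "%s\n" "${RESPONSE}" | grep -o '"access_token": *"[^"]*"' | cut -d'"' -f4)"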

@Akianonymus Akianonymus changed the title Problem uploading folder ACCESS TOKEN not regenerating multiple times in a single execution of script Oct 19, 2020
@renatowow14
Author

renatowow14 commented Oct 19, 2020

So, I looked at the log today. The problem is that if the access token expires before all the files are uploaded, the script doesn't renew it automatically until it is executed again.

I just have to add the required code for that. Will try to do it ASAP.

OK, thank you very much. Besides this problem, I'm also having trouble uploading files of 200 GB each; I have to upload a backup to the drive, but with big files it is not working :/

@Akianonymus
Collaborator

Akianonymus commented Oct 19, 2020

@renatowow14

Wow, those are quite big files; I've really never tested files that big.

Can you give more info on what error occurs? 🤔

Also, create a new issue for it so I can track it properly.

For grabbing logs, use the commands below (they will hide sensitive information from the logs):


gupload -D filename 2>| log 1>&2

# scrub credentials and IDs from the log before sharing it
for i in log; do
    # pull the VAR=value lines for each sensitive variable out of the log
    values="$(grep -oE "(CLIENT_ID|CLIENT_SECRET|REFRESH_TOKEN|ACCESS_TOKEN|ROOT_FOLDER|WORKSPACE_FOLDER_ID)=.*" "${i}")"
    eval "${values}"

    # mask every captured value with ***** in the log, in place
    for j in CLIENT_ID CLIENT_SECRET REFRESH_TOKEN ACCESS_TOKEN ROOT_FOLDER WORKSPACE_FOLDER_ID; do
        string="$(eval printf "%s" \"\$"${j}"\")"
        sed -i "s|${string}|*****|g" "${i}"
    done
done

Change filename to your file name

Akianonymus added a commit to Akianonymus/google-drive-upload that referenced this issue Oct 19, 2020
launch a background service to check access token and update it

checks ACCESS_TOKEN_EXPIRY, will update before 3 seconds of expiry

process will be killed when script exits

Fix labbots#113
@Akianonymus
Collaborator

So, I have pushed some fixes.

Run the command below to install the gupload command with the fixes. The command name will be test_gupload; use that to test.

curl --compressed -Ls https://github.com/labbots/google-drive-upload/raw/master/install.sh | sh -s -- -r akianonymus/google-drive-upload -B master -c test_gupload

It uses a different command name so the existing installation doesn't need to be touched.

@renatowow14
Author

So, I have pushed some fixes.

Run the command below to install the gupload command with the fixes. The command name will be test_gupload; use that to test.

curl --compressed -Ls https://github.com/labbots/google-drive-upload/raw/master/install.sh | sh -s -- -r akianonymus/google-drive-upload -B master -c test_gupload

It uses a different command name so the existing installation doesn't need to be touched.

All right, thanks for the support. I will try to upload this backup folder with files of 200 GB each to my drive; in case of errors I'll send the log file, using the commands mentioned above to capture the logs.

@renatowow14
Author

renatowow14 commented Oct 20, 2020

Unfortunately I can't upload all my files. It managed to upload only one of them; after the first error I tried to upload again using the -d option to skip existing files, but it says that all the files are already there, while there is only one 243 GB file there. I will leave the logs of the two attempts below for analysis.

logs ( link removed for sensitive info )

Akianonymus added a commit to Akianonymus/google-drive-upload that referenced this issue Oct 20, 2020
launch a background service to check access token and update it

checks ACCESS_TOKEN_EXPIRY, try to update before 5 mins of expiry, a fresh token gets 60 mins ( 3600 seconds )

process will be killed when script exits

Fix labbots#113
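In shell, the background service described in this commit message comes down to roughly the sketch below (illustrative only: _refresh_access_token stands in for a renewal helper like the curl call shown earlier, and ACCESS_TOKEN_EXPIRY is assumed to hold a Unix timestamp):

_token_bg_service() {
    while :; do
        # renew 5 minutes (300 s) before expiry; a fresh token lasts 3600 s
        [ "$((ACCESS_TOKEN_EXPIRY - $(date +%s)))" -lt 300 ] && _refresh_access_token
        sleep 30
    done
}

_token_bg_service &
TOKEN_SERVICE_PID="${!}"
# make sure the background process dies when the script exits
trap 'kill "${TOKEN_SERVICE_PID}" 2> /dev/null' EXIT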
@Akianonymus
Collaborator

I have pushed some new fixes.

Run the command below to update the test gupload command:

curl --compressed -Ls https://github.com/labbots/google-drive-upload/raw/master/install.sh | sh -s -- -r akianonymus/google-drive-upload -B master -c test_gupload

For grabbing logs, use the commands below (they will hide sensitive information from the logs):


test_gupload -D filename 2>| log 1>&2

# scrub credentials and IDs from the log before sharing it
for i in log; do
    # pull the VAR=value lines for each sensitive variable out of the log
    values="$(grep -oE "(CLIENT_ID|CLIENT_SECRET|REFRESH_TOKEN|ACCESS_TOKEN|ROOT_FOLDER|WORKSPACE_FOLDER_ID)=.*" "${i}")"
    eval "${values}"

    # mask every captured value with ***** in the log, in place
    for j in CLIENT_ID CLIENT_SECRET REFRESH_TOKEN ACCESS_TOKEN ROOT_FOLDER WORKSPACE_FOLDER_ID; do
        string="$(eval printf "%s" \"\$"${j}"\")"
        sed -i "s|${string}|*****|g" "${i}"
    done
done

Change filename to your file name

I am writing this again because you didn't run all the commands, and because of that your sensitive information was again visible in the logs.

Akianonymus added a commit to Akianonymus/google-drive-upload that referenced this issue Oct 20, 2020
launch a background service to check access token and update it

checks ACCESS_TOKEN_EXPIRY, try to update before 5 mins of expiry, a fresh token gets 60 mins ( 3600 seconds )

process will be killed when script exits

create a temp file where updated access token will be stored by the bg service

every function that uses access token will source it on every call

make a new function named _api_request for all oauth network calls

Fix labbots#113

google-oauth2.[bash|sh]: Apply new changes
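Sketched in shell, the temp-file handoff described above could look like this (only the _api_request name comes from the commit message; the file name and the rest are illustrative):

TMPFILE="$(mktemp)" # the bg service writes "ACCESS_TOKEN=..." here after each renewal

_api_request() {
    # re-read the token on every call so mid-run renewals are picked up
    [ -r "${TMPFILE}" ] && . "${TMPFILE}"
    curl -s -H "Authorization: Bearer ${ACCESS_TOKEN}" "${@}"
}

# usage: route every Drive API call through the wrapper, e.g.
#   _api_request "https://www.googleapis.com/drive/v3/about?fields=user"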
@renatowow14
Author

I have pushed some new fixes.

Run the command below to update the test gupload command:

curl --compressed -Ls https://github.com/labbots/google-drive-upload/raw/master/install.sh | sh -s -- -r akianonymus/google-drive-upload -B master -c test_gupload

For grabbing logs, use the commands below (they will hide sensitive information from the logs):

test_gupload -D filename 2>| log 1>&2

for i in log; do
    values="$(grep -oE "(CLIENT_ID|CLIENT_SECRET|REFRESH_TOKEN|ACCESS_TOKEN|ROOT_FOLDER|WORKSPACE_FOLDER_ID)=.*" "${i}")"
    eval "${values}"

    for j in CLIENT_ID CLIENT_SECRET REFRESH_TOKEN ACCESS_TOKEN ROOT_FOLDER WORKSPACE_FOLDER_ID; do
        string="$(eval printf "%s" \"\$"${j}"\")"
        sed -i "s|${string}|*****|g" "${i}"
    done
done

Change filename to your file name

I am writing this again because you didn't run all the commands, and because of that your sensitive information was again visible in the logs.

Okay, I ended up not paying attention to that detail; I'll run it again and bring back the resulting logs.

Akianonymus added a commit to Akianonymus/google-drive-upload that referenced this issue Oct 20, 2020
launch a background service to check access token and update it

checks ACCESS_TOKEN_EXPIRY, try to update before 5 mins of expiry, a fresh token gets 60 mins ( 3600 seconds )

process will be killed when script exits

create a temp file where updated access token will be stored by the bg service

every function that uses access token will source it on every call

make a new function named _api_request for all oauth network calls

decrease a network request ( to fetch expiry of access token, just calculate it locally using remaining time given in json as expires_in )

Fix labbots#113

google-oauth2.[bash|sh]: Apply new changes
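The "decrease a network request" point above works because Google's token response already carries an expires_in field (seconds remaining), so the expiry timestamp can be computed locally instead of fetched with an extra call; roughly (RESPONSE being the token JSON from the earlier sketch):

EXPIRES_IN="$(printf "%s\n" "${RESPONSE}" | grep -o '"expires_in": *[0-9]*' | grep -o '[0-9]*$')"
ACCESS_TOKEN_EXPIRY="$(($(date +%s) + EXPIRES_IN))"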
@renatowow14
Author

I was successful in uploading only one 243 GB file; the rest failed. I don't know what else to do. I've tried several scripts besides yours, but without success; I believe there is some limitation on Google's side.
log.zip

@Akianonymus
Collaborator

Akianonymus commented Oct 23, 2020

Have you tried uploading those files one by one?

As for the limit imposed by Google, it is 750 GB per day.

@renatowow14
Author

renatowow14 commented Oct 23, 2020

Have you tried uploading those files one by one?

As for the limit imposed by Google, it is 750 GB per day.

I'm doing that at the moment as a last alternative. I started uploading the next 243 GB file to the drive, inside the folder that already existed there, using the following invocation: script /folder_name/file_name folder_name, where folder_name is the folder on Google Drive.

@renatowow14
Author

I couldn't help noticing this error in the logs; apparently it is something related to credentials and authorization.

{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "authError",
        "message": "Invalid Credentials",
        "locationType": "header",
        "location": "Authorization"
      }
    ],
    "code": 401,
    "message": "Invalid Credentials"
  }
}

@renatowow14
Author

I realized that sending the file to an existing folder didn't work; it is uploading to the root of my drive. How can I send a file to an existing folder on the drive? Could you tell me the syntax?

@renatowow14
Author

Sorry, it was not clear to me: the error that appeared in the last logs I sent is due to the Google limitation, right?

@Akianonymus
Collaborator

Akianonymus commented Oct 29, 2020

@renatowow14 Yes, you can increase the daily upload limit of your account. You will have to use service accounts. I will try to integrate them with the script when I get some time.

Reference: https://cloud.google.com/iam/docs/service-accounts

I don't understand, what data are you referring to? 🤔

@renatowow14
Author

renatowow14 commented Oct 29, 2020

I am wondering if the only problem was the 750 GB limit; I will post the error part of the log that I sent.

{
  "error": {
    "errors": [
      {
        "domain": "usageLimits",
        "reason": "userRateLimitExceeded",
        "message": "User rate limit exceeded."
      }
    ],
    "code": 403,
    "message": "User rate limit exceeded."
  }
}


@Akianonymus
Collaborator

So I read about this problem, and it says my quota was exceeded, but I'm not sure that's correct; I didn't upload that many subfolders to have crossed the limit. I believe it could be something in the script.

https://developers.google.com/drive/api/v3/handle-errors#:~:text=Resolve%20a%20403%20error%3A%20User%20rate%20limit%20exceeded,-A%20userRateLimitExceeded%20error&text=%7D-,To%20fix%20this%20error%3A,in%20the%20Developer%20Console%20project.&text=If%20one%20user%20is%20making,(setting%20the%20quotaUser%20parameter).

https://developers.google.com/drive/api/v3/handle-errors#quota

If you think it's because of the script, then try uploading again; it should upload if you have not exceeded the quota.

@renatowow14
Author

So I read about this problem, and it says my quota was exceeded, but I'm not sure that's correct; I didn't upload that many subfolders to have crossed the limit. I believe it could be something in the script.
https://developers.google.com/drive/api/v3/handle-errors#:~:text=Resolve%20a%20403%20error%3A%20User%20rate%20limit%20exceeded,-A%20userRateLimitExceeded%20error&text=%7D-,To%20fix%20this%20error%3A,in%20the%20Developer%20Console%20project.&text=If%20one%20user%20is%20making,(setting%20the%20quotaUser%20parameter).
https://developers.google.com/drive/api/v3/handle-errors#quota

If you think it's because of the script, then try uploading again; it should upload if you have not exceeded the quota.

I just started uploading a file that was missing from the folder and it is uploading; now I don't know what to say.

@renatowow14
Author

In short, it uploaded all the files in the folder except for a 15 GB one, which gave the error that appears at the end of the last log I sent you.

@Akianonymus
Collaborator

Akianonymus commented Oct 29, 2020

So, I think I misunderstood your problem; it's more of an API error rather than a storage quota error.

Basically, a solution would be to use the -R flag, so the script retries by itself in case of error.
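For instance, a hypothetical invocation (assuming -R takes the number of retries; filename is a placeholder):

test_gupload -R 3 filename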

@renatowow14
Author

I used the -d option to skip the existing ones, but apparently I reached the quota limit; are there any suggestions?

Captura de tela de 2020-10-28 23-14-22

@Akianonymus
Collaborator

I used the -d option to skip the existing ones, but apparently I reached the quota limit; are there any suggestions?

Captura de tela de 2020-10-28 23-14-22

Best would be to try after some time. Wait 5-10 minutes to see if it changes.

@renatowow14
Author

A long time has passed; I will try to upload again and come back here if there are errors. Apparently the problem of uploading large files has been solved; now it seems to me to be a Google limit problem.

@renatowow14
Author

I used the -d option to skip the existing ones, but apparently I reached the quota limit; are there any suggestions?
Captura de tela de 2020-10-28 23-14-22

Best would be to try after some time. Wait 5-10 minutes to see if it changes.

Unfortunately it didn't work; tomorrow I will try to increase the quota limit. Will I then be able to send more? :/

@Akianonymus
Collaborator

I used the -d option to skip the existing ones, but apparently I reached the quota limit; are there any suggestions?
Captura de tela de 2020-10-28 23-14-22

Best would be to try after some time. Wait 5-10 minutes to see if it changes.

Unfortunately it didn't work; tomorrow I will try to increase the quota limit. Will I then be able to send more? :/

Can you test uploading any other file, to check if it works?

@renatowow14
Author

I used the -d option to skip the existing ones, but apparently I reached the quota limit; are there any suggestions?
Captura de tela de 2020-10-28 23-14-22

Best would be to try after some time. Wait 5-10 minutes to see if it changes.

Unfortunately it didn't work; tomorrow I will try to increase the quota limit. Will I then be able to send more? :/

Can you test uploading any other file, to check if it works?

Yes, I tried another one and it doesn't work.

Captura de tela de 2020-10-29 00-16-17

@Akianonymus
Collaborator

Akianonymus commented Oct 29, 2020

Yeah, it seems like you hit the hard limit; you cannot upload anything more today.

The only solution is to use service accounts.

@renatowow14
Author

Yeah, it seems like you hit the hard limit; you cannot upload anything more today.

The only solution is to use service accounts.

I'll try to increase the limit tomorrow and try to send it again then; thanks for the support.

@Akianonymus
Collaborator

Akianonymus commented Oct 29, 2020

@renatowow14 I will try to add the service account feature soon.

I have created an issue to track it:

#122

@Akianonymus
Collaborator

I have added some code to notify the user when the upload limit is reached.

Reference: rclone/rclone#3857 (comment)
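A sketch of what such a check can look like (not the committed code): match the reason field in the API's JSON error response and warn the user.

if printf "%s\n" "${RESPONSE}" | grep -q '"reason": *"userRateLimitExceeded"'; then
    printf "%s\n" "Upload limit reached: Google caps uploads at about 750 GB per account per day. Try again later."
fi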


@renatowow14
Author

I can't upload anything yet, I'm worried now, I'll try to increase the limit :/

Captura de tela de 2020-10-29 10-45-15

@Anon-Exploiter

Anon-Exploiter commented Oct 29, 2020

I can't upload anything yet, I'm worried now, I'll try to increase the limit :/

Captura de tela de 2020-10-29 10-45-15

You can just create a new project, delete the old one, generate OAuth credentials for it, and start using it. If that doesn't work, switch to another Google account and do the same there. Simple.

@Akianonymus
Collaborator

I can't upload anything yet, I'm worried now, I'll try to increase the limit :/

Captura de tela de 2020-10-29 10-45-15

Typically, this should renew tomorrow, after a day.

@renatowow14
Author

I can't upload anything yet, I'm worried now, I'll try to increase the limit :/
Captura de tela de 2020-10-29 10-45-15

Typically, this should renew tomorrow, after a day.

All right, I'll try asking for a new quota today; if that doesn't work, I'll wait the 24 hours. Thanks again.

@renatowow14
Author

Will you push the fixes to the master repository?

@renatowow14
Author

I can't upload anything yet, I'm worried now, I'll try to increase the limit :/
Captura de tela de 2020-10-29 10-45-15

You can just create a new project, delete the old one, generate OAuth credentials for it, and start using it. If that doesn't work, switch to another Google account and do the same there. Simple.

It may be an option; I will try it if nothing else works.

@Akianonymus
Collaborator

Akianonymus commented Oct 29, 2020

Just wait a bit; I have almost finished updating the script to use service accounts. Just need to finish it up.

@Akianonymus
Collaborator

The fixes have been merged and a new release has been created:

https://github.com/labbots/google-drive-upload/releases/tag/v3.4.3

For further discussion of the rate limit error, move on to #122.

@renatowow14
Author

Right, so I can now use the master upload script, which will already have the fixes, right? Thank you again.

@Akianonymus
Collaborator

@renatowow14 Yes
