I am writing a script to download the contents of a shared folder on Google Drive to a local computer. The shared folder is updated by other users and I need to run the script daily; at present it contains 200-400 files. So far I have created OAuth credentials, signed in to grant access, and enabled the Drive API. I can retrieve a list of all files with a particular parent id (the id of the shared folder) and then start downloading every listed file that is missing locally or has been updated since the last run. Unfortunately, after successfully downloading the first file, the script fails on the second file with the error: "Daily Limit for Unauthenticated Use Exceeded. Continued use requires signup."
If I restart the script, it logs in, gets the list, skips the first file as intended (its modified time on Drive is not more recent than the local modified time), downloads the second file, then throws the same error on the third file. I can repeat this process, and each run fetches exactly one more file.
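For reference, the listing step amounts to a files().list call with a parents query; my list_items_with_parent helper is along these lines (the exact fields, the trashed filter, and the pagination details here are illustrative rather than my exact code):

```python
def build_parent_query(folder_id):
    """Drive v3 query string matching items whose parent is folder_id."""
    return "'%s' in parents and trashed = false" % folder_id

def list_items_with_parent(service, folder_id):
    """Return all items in the given folder, following nextPageToken."""
    items, page_token = [], None
    while True:
        response = service.files().list(
            q=build_parent_query(folder_id),
            fields='nextPageToken, files(id, name, mimeType, modifiedTime)',
            pageToken=page_token).execute()
        items.extend(response.get('files', []))
        page_token = response.get('nextPageToken')
        if page_token is None:
            return items
```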
My sign_in() function is based on the login from the Drive quickstart.py: credentials.json is read, the user is given a URL to visit and approve access, and a token is stored in token.pickle.
I thought that my session was expiring very quickly, so at the #download latest copy of file step I tried logging in again and creating a new service to pass (after each file), but I was still refused once the first file had downloaded.
Here are my sign_in, download_to_folder, and main functions:
import datetime
import io
import os
import pickle

from google.auth.transport.requests import Request
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient import http
from googleapiclient.discovery import build

SCOPES = ['https://www.googleapis.com/auth/drive.readonly']

def download_to_folder(service, from_file, dest_file, dest_folder='./'):
    # download in chunks
    # create the destination folder if it does not exist
    check_if_folder_exists(dest_folder, create=True)
    request = service.files().export_media(fileId=from_file['id'],
                                           mimeType=from_file['mimeType'])
    fh = io.FileIO(dest_folder + '/' + dest_file, 'wb')
    downloader = http.MediaIoBaseDownload(fh, request)
    done = False
    while done is False:
        status, done = downloader.next_chunk()
        print("Download %d%%." % int(status.progress() * 100))
def sign_in():
    """Shows basic usage of the Drive v3 API.
    Prints the names and ids of the first 10 files the user has access to.
    """
    creds = None
    # The file token.pickle stores the user's access and refresh tokens, and is
    # created automatically when the authorization flow completes for the first
    # time.
    if os.path.exists('token.pickle'):
        with open('token.pickle', 'rb') as token:
            creds = pickle.load(token)
    # If there are no (valid) credentials available, let the user log in.
    if not creds or not creds.valid:
        if creds and creds.expired and creds.refresh_token:
            creds.refresh(Request())
        else:
            flow = InstalledAppFlow.from_client_secrets_file(
                './client_secret.json', SCOPES)
            creds = flow.run_local_server(port=12345)
        # Save the credentials for the next run
        with open('token.pickle', 'wb') as token:
            pickle.dump(creds, token)
    return creds
def main():
    creds = sign_in()
    service = build('drive', 'v3', credentials=creds)
    folder_id = 'REDACTED_FOLDER_ID'
    # get items in the shared drive folder
    items = list_items_with_parent(service, folder_id)
    # get sub-folders from items
    sub_folders = [item for item in items
                   if item['mimeType'] == 'application/vnd.google-apps.folder']
    for folder in sub_folders:
        sub_folder_items = list_items_with_parent(service, folder['id'])
        print('folder: ' + folder['name'])
        # set a new dest folder for this folder
        dest_folder = './test_folder/' + folder['name']
        for sub_item in sub_folder_items:
            if sub_item['mimeType'] == 'application/vnd.google-apps.folder':
                # skip sub-folders for now
                print('Skipping sub folder :|')
            else:
                # check and update files as necessary: compare mtimes
                last_mod_time = get_last_mod_time(dest_folder + '/' + sub_item['name'])
                if last_mod_time:
                    print('last modificationTime ' + last_mod_time.strftime('%Y-%m-%d %H:%M:%S'))
                else:
                    print(sub_item['name'] + ' not found...')
                print('gdrive last mod time: ' + sub_item['modifiedTime'])
                utc_time = datetime.datetime.strptime(sub_item['modifiedTime'],
                                                      "%Y-%m-%dT%H:%M:%S.%fZ")
                print('gdrive last mod time: ' + str(utc_time))
                # if the local copy does not exist or is outdated, download a new copy
                if not last_mod_time or last_mod_time < utc_time:
                    # download latest copy of file
                    ##creds = sign_in()
                    ##service = build('drive', 'v3', credentials=creds)
                    ## the line below is where the script crashes on the second file to be downloaded
                    download_to_folder(service, sub_item, sub_item['name'],
                                       dest_folder=dest_folder)
                else:
                    print('File: ' + sub_item['name'] + ' up to date.')

if __name__ == '__main__':
    main()
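The freshness check in main() boils down to the comparison below; this is a self-contained restatement of what get_last_mod_time plus the strptime call are meant to do (the helper names here are mine, for illustration):

```python
import datetime
import os

def drive_time_to_utc(modified_time):
    """Parse Drive v3's RFC 3339 modifiedTime (e.g. '2020-01-30T12:34:56.789Z')
    into a naive UTC datetime."""
    return datetime.datetime.strptime(modified_time, "%Y-%m-%dT%H:%M:%S.%fZ")

def needs_download(local_path, modified_time):
    """True if the local copy is missing or older than the Drive copy."""
    if not os.path.exists(local_path):
        return True
    local_utc = datetime.datetime.utcfromtimestamp(os.path.getmtime(local_path))
    return local_utc < drive_time_to_utc(modified_time)
```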
I have also tried this on a private (not shared) folder, in case I had hit some download limit, but again it only downloaded the first file.
According to the developer console I have made fewer than 700 requests today. This error comes up in many Google searches, and the answers all suggest that OAuth is not being applied properly. If that is the case, can someone advise how I can check?
creds.valid returns True and creds.expired returns False. Admittedly, creds.expiry returns a time in the past, the same as the token.pickle creation time, but I do not know whether that is wrong.
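This is how I collect those credential fields in one place when checking them (the attribute names follow google.oauth2.credentials.Credentials; the SimpleNamespace stand-in is only for illustration, and in practice I pass in the creds returned by sign_in()):

```python
import datetime
from types import SimpleNamespace

def summarise_creds(creds):
    """Gather the fields worth eyeballing when checking whether OAuth
    credentials are being applied."""
    return {
        'valid': creds.valid,
        'expired': creds.expired,
        'expiry': creds.expiry,
        'has_refresh_token': creds.refresh_token is not None,
        'scopes': creds.scopes,
    }

# Stand-in object with the same attributes, for illustration only:
fake = SimpleNamespace(valid=True, expired=False,
                       expiry=datetime.datetime(2020, 1, 30, 12, 0),
                       refresh_token='rt', scopes=['drive.readonly'])
print(summarise_creds(fake))
```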