0
votes

I am building a machine learning model with a fairly large dataset (2 GB), which cannot be processed on my local machine, so I decided to use Google Cloud Datalab. I have successfully created a VM, but I could not find how to import data the way I would locally (using pandas read_csv). My data is in Google Drive. Is there a simple way to handle this?


1 Answer

0
votes
# Execute the code below in a cell
from google.colab import drive
drive.mount('/mount_unique', force_remount=True)
# You will be asked to open a URL; click it, sign in, and copy the
# authorization code to the clipboard, then paste it into the text box
# shown and press Enter.

# Execute the code below to list the contents of the mount point
from os import listdir
print(listdir('/mount_unique'))

# print(listdir('/mount_unique/My Drive'))  # lists the contents of your Drive

import pandas as pd
pd.read_csv('/mount_unique/My Drive/<YOUR_DATA_FILE_PATH>')
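Since the file is around 2 GB, loading it in one call may exhaust memory on a small VM. A hedged sketch of an alternative, using pandas' `chunksize` parameter to stream the file: the in-memory sample below stands in for the mounted Drive path, which you would substitute in practice.

```python
import io
import pandas as pd

# Small in-memory sample standing in for the real file; in Colab you would
# pass '/mount_unique/My Drive/<YOUR_DATA_FILE_PATH>' instead.
csv_data = io.StringIO("a,b\n1,2\n3,4\n5,6\n7,8\n")

chunks = []
# chunksize=2 reads two rows per iteration; for a 2 GB file you might use
# something like chunksize=100_000 and filter/aggregate each chunk instead
# of keeping all of them.
for chunk in pd.read_csv(csv_data, chunksize=2):
    chunks.append(chunk)

df = pd.concat(chunks, ignore_index=True)
print(len(df))
```

Processing each chunk as it arrives (rather than concatenating them all, as the sketch does for illustration) keeps peak memory roughly proportional to the chunk size rather than the file size.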