Method 1: Use remote_api
How to: write a bulkloader.yaml file and run it from the terminal with the "appcfg.py upload_data" command.
I don't recommend this method, for two reasons: 1. huge latency, 2. no support for NDB.
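If you do go this route, here is a minimal sketch of the two pieces, assuming a two-column CSV whose header row names the columns, and a kind called DataStoreModel to match the examples further down (both are assumptions):

# bulkloader.yaml -- minimal sketch
python_preamble:
- import: google.appengine.ext.bulkload.transform

transformers:
- kind: DataStoreModel
  connector: csv
  connector_options:
    columns: from_header     # take column names from the CSV header row
  property_map:
    - property: attr1
      external_name: attr1
    - property: link
      external_name: link

Then, from the terminal:

appcfg.py upload_data --config_file=bulkloader.yaml --filename=data.csv \
    --kind=DataStoreModel --url=http://<your-app-id>.appspot.com/_ah/remote_api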
Method 2: GCS and MapReduce
Uploading Data File to GCS:
Use the "storage-file-transfer-json-python" GitHub project (chunked_transfer.py) to upload files to GCS from your local system.
Make sure to generate a proper client-secrets.json file from the App Engine admin console.
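The heart of that sample is a resumable, chunked upload through the Cloud Storage JSON API. A stripped-down sketch of the same approach (chunk size, file names, and the credentials cache path are assumptions):

import httplib2
from apiclient.discovery import build
from apiclient.http import MediaFileUpload
from oauth2client.client import flow_from_clientsecrets
from oauth2client.file import Storage
from oauth2client.tools import run

CHUNKSIZE = 2 * 1024 * 1024  # upload 2 MB per request

def get_service():
    # client-secrets.json is the OAuth client file mentioned above
    flow = flow_from_clientsecrets(
        'client-secrets.json',
        scope='https://www.googleapis.com/auth/devstorage.read_write')
    storage = Storage('credentials.json')  # local token cache (assumed path)
    credentials = storage.get()
    if credentials is None or credentials.invalid:
        credentials = run(flow, storage)
    return build('storage', 'v1', http=credentials.authorize(httplib2.Http()))

def upload(service, bucket_name, src_file, dst_object):
    media = MediaFileUpload(src_file, chunksize=CHUNKSIZE, resumable=True)
    request = service.objects().insert(
        bucket=bucket_name, name=dst_object, media_body=media)
    response = None
    while response is None:
        # Each call sends one chunk; a failed chunk can be retried
        # without restarting the whole upload.
        status, response = request.next_chunk()

if __name__ == '__main__':
    service = get_service()
    upload(service, 'your-bucket-name', 'tempfile.csv', 'tempfile.csv')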
Mapreduce:
Use the "appengine-mapreduce" github project. Copy the "mapreduce" folder to your project top-level folder.
Add the below line to your app.yaml file:
includes:
- mapreduce/include.yaml
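The handlers in both methods below import DataStoreModel from a models module that isn't shown. A minimal sketch of what it might contain (the property names simply mirror the handler code; your real model will differ):

# models.py -- minimal sketch; the schema is an assumption
from google.appengine.ext import ndb

class DataStoreModel(ndb.Model):
    # one string attribute plus a link, matching the two CSV columns
    # used in the handlers below
    attr1 = ndb.StringProperty()
    link = ndb.StringProperty()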
Below is the main.py file:

import os
import csv
import StringIO
import webapp2

from google.appengine.api import app_identity

from mapreduce import base_handler
from mapreduce import mapreduce_pipeline
from mapreduce import operation as op

from models import DataStoreModel

def testmapperFunc(newRequest):
    # With format 'lines', each input record is one line of the CSV file.
    f = StringIO.StringIO(newRequest)
    reader = csv.reader(f, delimiter=',')
    for row in reader:
        newEntry = DataStoreModel(attr1=row[0], link=row[1])
        yield op.db.Put(newEntry)

class TestGCSReaderPipeline(base_handler.PipelineBase):
    def run(self, filename):
        yield mapreduce_pipeline.MapreducePipeline(
            "test_gcs",
            "main.testmapperFunc",  # module.function path of the mapper above
            "mapreduce.input_readers.FileInputReader",
            mapper_params={
                "files": [filename],
                "format": 'lines'
            },
            shards=1)

class tempTestRequestGCSUpload(webapp2.RequestHandler):
    def get(self):
        bucket_name = os.environ.get(
            'BUCKET_NAME', app_identity.get_default_gcs_bucket_name())
        bucket = '/gs/' + bucket_name  # FileInputReader expects the /gs/ prefix
        filename = bucket + '/' + 'tempfile.csv'
        pipeline = TestGCSReaderPipeline(filename)
        pipeline.with_params(target="mapreducetestmodtest")
        pipeline.start()
        self.response.out.write('done')

application = webapp2.WSGIApplication([
    ('/gcsupload', tempTestRequestGCSUpload),
], debug=True)
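Hitting /gcsupload starts the pipeline. Rather than just writing 'done', you can redirect to the status UI that ships with the pipeline library and watch progress there (this is the pattern the library's demos use):

# inside tempTestRequestGCSUpload.get(), replacing the 'done' write:
self.redirect(pipeline.base_path + "/status?root=" + pipeline.pipeline_id)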
To remember:
- The MapReduce project uses the now-deprecated "Google Cloud Storage Files API", so future support is not guaranteed. (Newer releases of the library add a GCS-native input reader, GoogleCloudStorageInputReader, that does not depend on the Files API.)
- MapReduce adds a small overhead to datastore reads and writes.
Method 3: GCS and GCS Client Library
- Upload the CSV/text file to GCS using the file-transfer method above.
- Use the GCS client library (copy the 'cloudstorage' folder into your application's top-level directory).
Add the below code to the application's main.py file:
import os
import csv
import webapp2

import cloudstorage as gcs
from google.appengine.api import app_identity
from google.appengine.ext import ndb

from models import DataStoreModel

class UploadGCSData(webapp2.RequestHandler):
    def get(self):
        bucket_name = os.environ.get(
            'BUCKET_NAME', app_identity.get_default_gcs_bucket_name())
        bucket = '/' + bucket_name  # the GCS client library takes /bucket/object paths
        filename = bucket + '/tempfile.csv'
        self.upload_file(filename)

    def upload_file(self, filename):
        gcs_file = gcs.open(filename)  # opens the object for reading
        datareader = csv.reader(gcs_file)
        count = 0
        entities = []
        for row in datareader:
            count += 1
            newProd = DataStoreModel(attr1=row[0], link=row[1])
            entities.append(newProd)
            if count % 50 == 0 and entities:
                ndb.put_multi(entities)  # write in batches of 50
                entities = []
        if entities:
            ndb.put_multi(entities)  # flush the final partial batch
        gcs_file.close()

application = webapp2.WSGIApplication([
    ('/gcsupload', UploadGCSData),
], debug=True)
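One caveat for both GCS methods: the import runs inside a normal request handler, and App Engine frontend requests are capped at 60 seconds. For files too large to process within that limit, one option is to hand the work to the task queue with the deferred library, since tasks get a 10-minute deadline. A minimal sketch, assuming the batched-put logic is pulled out into a module-level function:

from google.appengine.ext import deferred

def import_gcs_file(filename):
    # Same batched ndb.put_multi logic as upload_file above, moved to a
    # module-level function so deferred can pickle a reference to it.
    gcs_file = gcs.open(filename)
    entities = []
    for row in csv.reader(gcs_file):
        entities.append(DataStoreModel(attr1=row[0], link=row[1]))
        if len(entities) >= 50:
            ndb.put_multi(entities)
            entities = []
    if entities:
        ndb.put_multi(entities)
    gcs_file.close()

class DeferredUploadGCSData(webapp2.RequestHandler):
    def get(self):
        bucket_name = os.environ.get(
            'BUCKET_NAME', app_identity.get_default_gcs_bucket_name())
        deferred.defer(import_gcs_file, '/' + bucket_name + '/tempfile.csv')
        self.response.out.write('import queued')

(The deferred library also needs "builtins: - deferred: on" in app.yaml.)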