I need to import a large amount of data from our on-premises SQL Server into Bigtable every day: 100-200 million rows.
I tried sending the data to Bigtable via its write API, but it was very slow (around 20 million rows per hour).
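For context, this is roughly the pattern I'm using — a minimal sketch assuming the Python client (`google-cloud-bigtable`); the project, instance, table, and column-family names are placeholders. I batch rows and send them with `mutate_rows`, the bulk-write call, rather than one RPC per row:

```python
from itertools import islice

def batched(iterable, size):
    """Yield lists of up to `size` items from `iterable`."""
    it = iter(iterable)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk

# Hypothetical Bigtable usage (names are placeholders, not my real setup):
# from google.cloud import bigtable
# client = bigtable.Client(project="my-project", admin=False)
# table = client.instance("my-instance").table("my-table")
# for chunk in batched(sql_rows, 10_000):   # sql_rows: (key, value) pairs from SQL Server
#     rows = []
#     for key, value in chunk:
#         row = table.direct_row(key)
#         row.set_cell("cf", "col", value)
#         rows.append(row)
#     table.mutate_rows(rows)  # one bulk RPC per 10k rows instead of per-row writes
```

Even with batching like this, throughput tops out well below what I need, which is why I started looking at Dataflow.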
I found that it can be much faster to load files from Google Cloud Storage into Bigtable using Google Cloud Dataflow, but it seems overly complicated and unnecessary to export from SQL Server to a file, upload the file, and then import it.
I'm hoping for a simpler solution that enables batch loading from SQL Server to Bigtable without going through intermediate files.
Links to, or a description of, the best approach here would be great.