I am crawling data from a URL using Beautiful Soup, and I want to store the crawled data in Azure Blob Storage as a blob. Below is the code I use to save the data locally; I want to do the same thing but upload directly into Azure.
from urllib.request import urlopen
from bs4 import BeautifulSoup

soup = BeautifulSoup(urlopen('http://www.abc.html'), 'html.parser')
data = soup.encode("ascii", "ignore")  # serialized page as bytes
outfile = open('C:\\Users\\ADMIN\\filename.txt', 'wb')  # binary mode, since data is bytes
outfile.write(data)
outfile.close()
This code successfully saves the website data to my local folder. Please help me save the data from the same website directly into Azure Blob Storage. I already have the storage account name and key. Here is what I have tried:
from azure.storage.blob import BlockBlobService
block_blob_service = BlockBlobService(account_name='myaccount', account_key='mykey')  # placeholders
soup = BeautifulSoup(urlopen('http://www.abc.html'), 'html.parser')
data = soup.encode("ascii", "ignore")
block_blob_service.create_blob_from_text('containername', 'filename.txt', data)
I am trying the above piece of code, but it is not working.
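For reference, here is a minimal sketch of the direct upload I am aiming for, assuming the legacy azure-storage SDK (BlockBlobService, pre-v12) and placeholder account name, key, container name, and URL; since soup.encode() returns bytes rather than text, create_blob_from_bytes may be a better fit than create_blob_from_text:

from urllib.request import urlopen
from bs4 import BeautifulSoup
from azure.storage.blob import BlockBlobService  # legacy azure-storage SDK, pre-v12

# Placeholder credentials and container -- replace with real values.
block_blob_service = BlockBlobService(account_name='myaccount', account_key='mykey')

soup = BeautifulSoup(urlopen('http://www.abc.html'), 'html.parser')
data = soup.encode("ascii", "ignore")  # serialized page as bytes

# Upload the bytes directly as a block blob; no temporary local file is needed.
# The container 'containername' must already exist in the storage account.
block_blob_service.create_blob_from_bytes('containername', 'filename.txt', data)

If the newer azure-storage-blob v12 package is installed instead, the equivalent upload would go through BlobServiceClient and its upload_blob method rather than BlockBlobService.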