I am experimenting with the text mining tools that R offers, but I am running into the following problem because I am working on an old machine.
I want to create a Document-Term Matrix using the tm package and the Corpus function. When I create the DTM, I get an error saying R cannot allocate a vector of about 4 GB (my machine has only 2 GB of RAM). How do you deal with such a problem in general? In typical applications the DTM would be much larger than mine, so there must be a standard approach. Is there a way to back the matrix with an SQL database instead of holding it entirely in memory?
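For reference, here is a minimal sketch of roughly what I am doing (the directory path is a placeholder for my actual data):

```r
library(tm)

# Build a corpus from a directory of plain-text files
corpus <- Corpus(DirSource("path/to/texts"),
                 readerControl = list(language = "en"))

# Basic cleanup before building the matrix
corpus <- tm_map(corpus, content_transformer(tolower))
corpus <- tm_map(corpus, removePunctuation)
corpus <- tm_map(corpus, removeWords, stopwords("en"))

# This is the step that fails with "cannot allocate vector of size ..."
dtm <- DocumentTermMatrix(corpus)
```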
Edit: I have studied a related post about using the sqldf library to create a temporary SQLite database, but in my case that does not help: I cannot even create the matrix in the first place.
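As I understand it, the trick in that post looks something like the sketch below (the CSV file name is a placeholder): read.csv.sql stages the file in a temporary on-disk SQLite database and returns only the query result, so the full file never has to fit in RAM. My data are a collection of plain-text documents rather than a single table, which is part of why this does not seem to map onto the DTM problem.

```r
library(sqldf)

# Stage the file in a temporary on-disk SQLite database and pull back
# only the rows the query selects, instead of loading everything into RAM.
subset_df <- read.csv.sql("big_file.csv",
                          sql = "select * from file limit 1000")
```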