
Can anyone please let me know the minimum RAM required (on the host machine) for running Cloudera's Hadoop on VMware Workstation?

I have 6 GB of RAM, and the documentation says the VM requires 4 GB.

Still, when I run it, CentOS loads and then the VM crashes. No other applications are running at the time.

Are there any other options apart from installing Hadoop manually?

4 GB of free RAM, probably. And last time I checked, it said 8 GB. - OneCricketeer
So you mean 8 GB of RAM is required on my system? Can you suggest any other way/vendor that might work under this constraint (6 GB of RAM)? - Chica_Programmador
I've been able to get a simple automated Ambari-managed installation running in under 2 GB. The problem with CDH is that all the Hadoop services are installed for you. It really isn't clear what your end goal is, but if you want all of Hadoop, go buy some RAM; it's the cheapest part to upgrade. - OneCricketeer
For example, if all you want is Apache Spark, then you do not need Hadoop. - OneCricketeer
All I need is to run simple word count programs on files a few MB in size... but it has to be in Hadoop for an assignment, and upgrading RAM isn't an option for me. Thanks! - Chica_Programmador

1 Answer


Your host machine may be running out of memory, or some other issue may be preventing the VM from booting completely. There are a couple of other options if you don't want to deal with a manual install:

  • If you have access to a Docker environment, try the Docker image Cloudera provides; a minimal run sketch follows after this list.
  • Run it in the cloud with AWS, GCE, or Azure; they usually have a small allotment of personal/student credits available.
    • For AWS, EMR also makes it easy for you to run something repeatedly (a hedged one-shot sketch also follows below).
  • For really short durations, you could try the Hadoop demo from Bitnami (https://bitnami.com/stack/hadoop) and run whatever you need there.
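
For the Docker route, here is a minimal sketch that also covers the word count goal from the comments. It assumes the cloudera/quickstart image and the CDH examples jar path from older Cloudera docs; both may have changed since, and the container itself still wants several GB of RAM, so it may be tight on a 6 GB host:

    # Pull and start the Cloudera QuickStart container
    # (image name/tag taken from older Cloudera docs; may no longer be published).
    docker pull cloudera/quickstart:latest
    docker run --hostname=quickstart.cloudera --privileged=true -t -i \
        cloudera/quickstart /usr/bin/docker-quickstart

    # Inside the container: stage a small input file in HDFS and run the
    # bundled word count example (jar path is the usual CDH location).
    hadoop fs -mkdir -p input
    hadoop fs -put /etc/hosts input
    hadoop jar /usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar \
        wordcount input output
    hadoop fs -cat 'output/part-r-*'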
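
And a hedged sketch of the EMR option: a single small cluster that runs the bundled word count jar as a step and terminates itself when done. The release label, instance type, bucket names, and step arguments here are all illustrative assumptions; check the current EMR docs before relying on them:

    # One-shot word count run; s3://my-bucket/... are placeholder paths
    # you would replace with your own bucket.
    aws emr create-cluster \
        --name "wordcount-assignment" \
        --release-label emr-5.33.0 \
        --applications Name=Hadoop \
        --instance-type m5.xlarge \
        --instance-count 1 \
        --use-default-roles \
        --auto-terminate \
        --steps 'Type=CUSTOM_JAR,Name=WordCount,ActionOnFailure=TERMINATE_CLUSTER,Jar=command-runner.jar,Args=[hadoop,jar,/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar,wordcount,s3://my-bucket/input,s3://my-bucket/output]'

Since the cluster auto-terminates after the step, you only pay (or spend credits) for the minutes the job actually runs.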