FAQ
- Scheduling Workflows
- Custom Nodes
- Distributions Supported
- Workflow Export/Import
- Submit Apache Spark Jobs
- Multi User Support
- Data Sources
- Hadoop Installation Prerequisites
- Linux
- JDK
- Disable IPv6
- SELinux
- Steps Involved in Installing Hadoop
- After Installation of Cloudera Manager
- Add Proxy User in HDFS
- Create HDFS Directory
- Install Spark2
- Log in Again to Cloudera Manager
- Increase YARN Container Memory to 8 GB
- Get CDH to Use Java 8 After Installation
- Install Sparkflows
- Upload the Fire Insights Example Data Directory to HDFS
- Log in to Fire Insights
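
Several of the Hadoop setup items above (adding a proxy user, creating an HDFS directory, and uploading the example data) come down to a handful of HDFS commands. A minimal sketch, assuming the service user is named `sparkflows` and the example data sits in a local `fire-examples` directory; both names are illustrative, not taken from this page:

```shell
# Sketch only: the user name and paths below are assumptions for illustration.

# Create an HDFS home directory for the Sparkflows service user and hand it over.
sudo -u hdfs hdfs dfs -mkdir -p /user/sparkflows
sudo -u hdfs hdfs dfs -chown sparkflows:sparkflows /user/sparkflows

# Upload the Fire Insights example data directory onto HDFS.
hdfs dfs -put fire-examples /user/sparkflows/fire-examples

# The proxy-user step itself is a core-site.xml change (set via Cloudera Manager),
# using Hadoop's standard impersonation properties, e.g.:
#   hadoop.proxyuser.sparkflows.hosts  = *
#   hadoop.proxyuser.sparkflows.groups = *
```

The `sudo -u hdfs` prefix is needed because `/user` is typically owned by the `hdfs` superuser; the proxy-user properties let the service impersonate end users for multi-user support.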