For the latest updates, check out my Apache Spark exploration at https://github.com/Mageswaran1989/aja/
How many tasks can I run concurrently in my cluster?
M (machines) * E (executors per machine) * C (cores per executor, with one core per task) = T (concurrent tasks)
E.g.: 12 machines * 2 executors * 12 cores = 288 concurrent tasks
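A minimal Scala sketch of the same arithmetic, using the example values above (the variable names are just the placeholders from the formula):

    // Task-slot arithmetic: one task per executor core.
    val machines = 12            // M: machines in the cluster
    val executorsPerMachine = 2  // E: executors per machine
    val coresPerExecutor = 12    // C: cores per executor, one task per core
    val tasks = machines * executorsPerMachine * coresPerExecutor
    println(s"Concurrent tasks: $tasks")  // prints: Concurrent tasks: 288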
How much RAM is usable for caching (storage)?
0.9 (spark.storage.safetyFraction) * 0.6 (spark.storage.memoryFraction) * M (machines) * E (executors per machine) * S (heap size in GB per executor) = usable storage memory in GB
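Note that spark.storage.safetyFraction (default 0.9) and spark.storage.memoryFraction (default 0.6) belong to Spark's legacy static memory model (pre-1.6); later versions use unified memory management. A minimal sketch of the arithmetic, assuming a hypothetical 8 GB heap per executor:

    // Storage-memory arithmetic under the legacy static memory model.
    val safetyFraction = 0.9     // spark.storage.safetyFraction (legacy default)
    val memoryFraction = 0.6     // spark.storage.memoryFraction (legacy default)
    val machines = 12            // M: machines in the cluster
    val executorsPerMachine = 2  // E: executors per machine
    val gbPerExecutor = 8.0      // S: executor heap in GB (assumed for illustration)
    val storageGB = safetyFraction * memoryFraction *
      machines * executorsPerMachine * gbPerExecutor
    println(f"Usable storage memory: $storageGB%.1f GB")  // prints: 103.7 GB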