scala - How can I benchmark performance in Spark console?


I have just started using Spark, and my interactions revolve around spark-shell at the moment. I want to benchmark how long various commands take, but I could not find how to get the time or run a benchmark. Ideally I want something super-simple, such as:

val t = [current_time]
data.map(etc).distinct().reduceByKey(_ + _)
println([current time] - t)

Edit: Figured it out --

import org.joda.time._
val t_start = DateTime.now()
[[do stuff]]
val t_end = DateTime.now()
new Period(t_start, t_end).toStandardSeconds()

I suggest the following:

def time[A](f: => A): A = {
  val s = System.nanoTime
  val ret = f
  println("time: " + (System.nanoTime - s) / 1e9 + " seconds")
  ret
}

You can pass any expression as a by-name argument to the time function; it computes the result of the expression and prints the time the evaluation took.

Let's say you have a function foobar that takes data as an argument. Then simply do the following:

val test = time(foobar(data)) 

test will contain the result of foobar, and you'll get the time it needed as well.
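To illustrate how the helper behaves outside of spark-shell, here is a minimal, self-contained sketch. The `data` collection and the word-count expression are hypothetical stand-ins for `foobar(data)`, chosen so the example runs with plain Scala and no Spark dependency:

```scala
// The `time` helper from the answer, reproduced so this sketch is self-contained.
def time[A](f: => A): A = {
  val s = System.nanoTime
  val ret = f             // evaluate the by-name argument exactly once
  println("time: " + (System.nanoTime - s) / 1e9 + " seconds")
  ret
}

// Hypothetical stand-in for foobar(data): a word count over a plain collection.
val data = Seq("spark", "scala", "spark", "shell")
val counts = time {
  data.map(w => (w, 1)).groupBy(_._1).map { case (w, ps) => (w, ps.size) }
}
// counts: Map(spark -> 2, scala -> 1, shell -> 1)
```

One caveat when using this in spark-shell: RDD transformations such as `map` and `distinct` are lazy, so timing them alone measures almost nothing. Wrap an action (e.g. `count()` or `collect()`) inside `time { ... }` to measure the actual work.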

