Hey fellow devs, I work at a media analytics startup, primarily in Java and Apache Spark, with a little Python on the side. I love solving problems; it's like problems are asking me to solve them :P
Okay, answering my own question: being a big data analytics company, we're sometimes tasked with processing huge amounts of data in a short period of time. Enter Apache Spark, a cluster computing framework that lets you run your code in parallel across thousands of cores. I was once running a job powered by 3,600 cores. I just loved the feeling of raw power, and the fact that my code didn't break at that scale :P
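For anyone who hasn't tried Spark, here's a minimal Java sketch of the idea. The class name, input path, and the ERROR filter are made-up placeholders, not anything from my actual job; the point is just that the same code scales from your laptop to a big cluster.

```java
import org.apache.spark.api.java.function.FilterFunction;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.SparkSession;

public class SparkSketch {
    public static void main(String[] args) {
        // local[*] uses every core on this machine; on a real cluster you
        // drop this line and let spark-submit supply the master URL instead.
        SparkSession spark = SparkSession.builder()
                .appName("parallel-demo")
                .master("local[*]")
                .getOrCreate();

        // Spark splits the input into partitions and processes them in
        // parallel across whatever cores the cluster gives you; the same
        // code runs unchanged on 4 cores or on 3,600.
        Dataset<String> lines = spark.read().textFile("/tmp/events/*.log");

        // The cast selects the Java-friendly FilterFunction overload of
        // filter(), which is otherwise ambiguous with a bare lambda.
        long errors = lines
                .filter((FilterFunction<String>) line -> line.contains("ERROR"))
                .count();

        System.out.println("Error lines: " + errors);
        spark.stop();
    }
}
```

You never write the parallelism yourself; you describe the transformation, and Spark's scheduler farms the partitions out to however many executor cores the cluster has.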
So what's yours?