Dr. Elan Sasson, datascience.co.il
Hi Elan, could you briefly introduce yourself to our readers?
I hold a PhD in Information Systems, specializing in machine learning and temporal text mining applications. I am an adjunct lecturer at Tel Aviv University, in the M.Sc. program in Business Analytics (Engineering Faculty) and the EMBA program (Recanati School of Management), where I teach courses in big data analytics, data science, and machine learning. I am also a member of the steering committee of Tel Aviv University's laboratory of AI, machine learning, and analytics.
As a long-time high-tech entrepreneur, I have 27 years of experience developing IT/IS products and software projects, as well as managing startups in Israel and abroad. I also serve on the boards of several high-tech companies and am the CEO of Data Science Group in Israel.
We can see that you have been involved with data science on many different levels. When was the first time you heard about the field and how did you end up working in it?
I first encountered the field of data science approximately eight years ago, and ever since then my passion for it has only grown. I am privileged to have worked extensively on a wide variety of data science projects and endeavours. Fortunately, I've worked in diverse fields and have relevant academic research and teaching experience.
You are one of the founders of Data Science Group (DSG). How did you come up with the idea to create the company and what problems do you solve for your clients?
The realization that data science and machine learning are becoming the lingua franca of the business environment, coupled with the shortage of seasoned data scientists, led me to co-found Data Science Group 3.5 years ago.
DSG is a multidisciplinary, lab-oriented group. We specialize in using scientific tools and methodologies to extract and utilize knowledge and insights from the information (data), in various formats, employed by organizations and companies. We have substantial experience in managing a wide range of projects, including fin-tech, ad-tech, high-tech, cloud computing, risk assessment, etc. Examples include CTR prediction, customer profiling, credit scoring, digital campaign optimization, recommendation engine architecture, smart cities, medical/healthcare applications, and many more.
What is so unique about your company?
The DSG Lab uniquely combines the dynamism and focus of a technology startup with the long-term, rigorous perspective of academic research, developing big data analytics and data science projects based on a novel agile methodology.
Finally, due to the inherently multidisciplinary nature of data science, we believe that no single data scientist can become an expert in every sub-field. Therefore, we have chosen to work collaboratively on every project, fully integrating the specialties of our various team members.
That's truly great! What area of improvement is typically the easiest to address and allows you to achieve some quick wins when you start working with a new company?
Surprisingly, many of our projects share similar attributes despite originating in different domains. Trivial commonalities are evident in the algorithms, platforms, and tools employed, but the more important similarities lie in the life cycle of a data science project, from inception to production. I would claim that a successful data science project stems not from the technical skills mentioned above, but from something far more fundamental, rooted in classic scientific method and research design.
Why do you believe this is the case?
It's all about defining the theoretical research problem and its operationalization. The world is not Kaggle. If this process is not done correctly, you will invariably find yourself answering the wrong question, or not answering any question at all.
The theoretical research problem, or business problem, must not be defined in vague terms, but rather with concise and coherent definitions easily comprehensible to non-technical people. This seems trivial, but it is a point that is often neglected.
Which tools and software are essential for your team?
Our team focuses on leading projects from conception, through problem identification, to the final product. We use the most cutting-edge tools to do so, including R, Python, Spark, H2O, Vowpal Wabbit, TensorFlow, and others. Generally speaking, we are unbiased toward the use of different technological tools to solve different problems.
We believe there is no 'one size fits all' solution, and thus we consider ourselves technology-agnostic.
Can you think of a situation when you have worked with a given company and you felt especially proud of what you helped them achieve?
We have adopted a very well-defined code of ethics and therefore do not engage in projects involving gambling, binary options, malware, and other applications that do not benefit humanity.
We are proud to be involved in breakthrough medical and healthcare applications that hold great promise for substantial change in the public health sector. To name a few: we work with a company that provides real-time predictive services for erroneous medical prescriptions to healthcare providers. For another company, we developed a prediction model that estimates cannabinoid percentage from (cannabis) spectrographic data. We also developed a supervised learning algorithm for a company specializing in a cancer screening method based on an ordinary blood test.
Tip of the interview. Is there a nifty tool that you discovered or were introduced to which you now can’t live without and want to share with aspiring data scientists?
Tools arise and fade away – it's an evolutionary process of fast-paced changes and trends – so I would like to focus on a working methodology that we have developed rather than on a nifty tool. For the last few years, we have engaged in data science projects across diverse domains, and we use a unique agile data science methodology and phased approach based on the concrete practical experience we have gained.
Our agile data science manifesto is based on:
- Engaging individuals and interactions over processes and tools.
- Working models with comprehensive documentation.
- Business stakeholders' collaboration in the process.
- Dynamically adapting to changes over the course of the data science project.
Every time we conduct these interviews, we finish with some nerdery. What is the one nerdy thing you would like to share with the world?
We as human beings must consider and prepare for the unavoidable rise of a super intelligent entity that will be the last invention made by humans...
Pun of the interview. Lastly, we leave our readers with a joke. What’s yours?
Espresso???
But I ordered a cappuccino!!
Don't worry, in the embedding space, they're almost the same thing…
Aaaa… That makes sense…
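For readers curious about the punchline, here is a minimal, purely illustrative sketch of why "espresso" and "cappuccino" can be "almost the same thing" in an embedding space: semantically related words map to nearby vectors, and closeness is typically measured with cosine similarity. The vectors below are invented for this example only; real embeddings (e.g. from word2vec or GloVe) are learned from large text corpora.

```python
# Toy illustration of the joke: related words end up as nearby vectors
# in an embedding space. The vectors here are made up for demonstration.
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors; values near 1.0 mean 'almost the same thing'."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Hypothetical 4-dimensional embeddings
espresso   = np.array([0.81, 0.62, 0.10, 0.05])
cappuccino = np.array([0.78, 0.65, 0.12, 0.07])
bulldozer  = np.array([0.05, 0.10, 0.90, 0.70])

print(cosine_similarity(espresso, cappuccino))  # ~0.999 -> almost the same thing
print(cosine_similarity(espresso, bulldozer))   # ~0.19  -> clearly not
```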