Faisal has 15 years of experience in the IT industry. He started his career as a J2EE software developer and later evolved into a J2EE architect. In 2012, he became interested in the field of Big Data and analytics and has been working in this field ever since. He has architected and implemented numerous data lake projects and data pipelines in Europe and the US, and has trained hundreds of IT and non-IT professionals on the topic of Big Data. He has expertise in several tools, including Apache Spark, Storm, Flink, Sqoop, NiFi, Kafka, HDFS, YARN, Ambari, Hive, Impala, Databricks, Sentry, Ranger, MapReduce, and Oozie, and he programs in Java and Python. He mainly works on scoping studies, Big Data architecture, data governance, and project design and management. He works in agile mode and is responsible for the end-to-end delivery of projects from both technical and management perspectives. He manages Digifloat in Asia.