Sameer Farooqui Expert Profile

Sameer is a freelance Big Data specialist with deep industry expertise in the Hadoop domain. Over the past five years, he has deployed clustering software internationally for clients including Fortune 500 companies, governments, hospitals, and banks. Currently, Sameer spends about 80% of his time delivering NoSQL training to corporations and 20% consulting. In the past year, he has taught over 50 courses, about 30 of them on-site. All of the curriculum Sameer teaches from, including the slides and labs, was custom developed by him.

Most recently, he was a Systems Architect at Hortonworks, where he specialized in designing Hadoop prototypes and proof-of-concept use cases. While at Hortonworks, Sameer also taught Hadoop Developer classes and visited customers as a sales engineer to brainstorm use cases. The core Hadoop products he specializes in are HDFS, MapReduce, Pig, Hive, HBase, and ZooKeeper.

Previously, Sameer worked at Accenture's Silicon Valley R&D lab, where he was responsible for evaluating NoSQL databases, cloud computing, and MapReduce for their commercial applicability to emerging big data problems. At Accenture Tech Labs, Sameer was the lead engineer in building a 32-node prototype on Cassandra and AWS to host 10 TB of Smart Grid data. He also worked on a 30-plus-person team during the design phase of a multi-environment Hadoop cluster pilot project at NetApp.

Before Hortonworks and Accenture, Sameer spent five years at Symantec, where he deployed Veritas clustering and storage solutions (VCS, VVR, SF-HA) for Fortune 500 and government clients throughout North America.

Sameer is a regular speaker at Big Data conferences and meetups.

Speaking Engagements:

30-minute sample from Sameer's Hadoop training class: