We Are Hiring:
Are you looking for a permanent job?
Talk to us about your career move.
We are always on the lookout for the best IT talent.
Career at Maccadin - India
Enjoy fun and frolic on Fridays.
Earn exciting rewards for meeting your targets.
Work in a fun-loving environment, race for your targets, and win fabulous rewards and bonuses.
- Technical Recruiters in Delhi (India).
- Staffing Sales Associates/Leads in Delhi (India).
Our Referral program:
Refer good candidates for contract or permanent opportunities with our clients and get:
- An excellent referral bonus for permanent hires.
- A recurring referral bonus for contract hires.
Big Data Solutions Architect
Location: Santa Clara, CA
- 8-10+ years of overall experience, with at least 2 years of hard-core experience designing and deploying large-scale Big Data (Hadoop) and BI solutions.
- 4+ years of experience working on large end-to-end ETL/DW projects.
- Full understanding of Extract-Transform-Load (ETL) concepts and technologies, as well as data warehouse (DW) concepts.
- Should have architected large-scale ETL mappings and transformations and implemented medium-to-large-scale DWs (5 TB+).
- Good at understanding and designing schemas for operational, staging, and warehouse layers.
- Should hold the Cloudera Certified Hadoop Developer and/or Cloudera Certified Hadoop Administrator certification.
- Should have worked on Hadoop design and architecture and have successfully implemented at least 2 large end-to-end Hadoop projects in production.
- Must have hands-on experience managing a 20+ node Hadoop cluster.
- Hands-on experience with the Hadoop stack (e.g. MapReduce, Sqoop, Pig, Hive, HBase, Flume).
- Hands-on experience with ETL (Extract-Transform-Load) tools (e.g. Informatica, Talend, Pentaho).
- Hands-on experience productionizing Hadoop applications (e.g. administration, configuration management, monitoring, debugging, and performance tuning).
- Excellent knowledge of NoSQL databases (HBase, MongoDB, etc.).
- Excellent knowledge of architecture, frameworks, and specifications: Java/J2EE, OOAD, design patterns.
- Previous experience with high-scale or distributed RDBMSs (Teradata, Netezza, Greenplum, Aster Data, Vertica) will be a huge plus.
- Excellent knowledge of Hadoop cluster configuration; expertise needed in the Hadoop scheduler, file system, consistency model, and processing infrastructure.
- Well versed in installing and managing Hadoop distributions (Cloudera CDH3/CDH4 with Cloudera Manager, MapR, Hortonworks, etc.).
- Able to install and configure Hadoop-based monitoring tools (Nagios, Ganglia, etc.).
- Quick learner who can design POCs on new technologies in short order.
- Experience in pre-sales and solutioning would be an added advantage.