US Citizen
Green Card
EAD (OPT/CPT/GC/H4)
H1B Work Permit
Corp-Corp
W2-Permanent
W2-Contract
Contract to Hire
Consulting/Contract
UG :- Not Required
PG :- Not Required
No. of positions :- 1
Posted :- 4th Sep 2023
Requirements:
• Bachelor’s degree in Computer Science, Computer Engineering, or a software-related discipline; a Master’s degree in a related field is a plus
• 6+ years of experience in Data Warehouse and Hadoop/Big Data
• 3+ years of experience in strategic data planning, standards, procedures, and governance
• 4+ years of hands-on experience in Python or Scala
• 4+ years of experience writing and tuning SQL and Spark queries
• 3+ years of experience working as a member of an Agile team
• Experience with Kubernetes and containers is a plus
• Experience understanding and managing Hadoop log files.
• Experience with Hadoop’s multiple data processing engines (interactive SQL, real-time streaming, data science, and batch processing) handling data stored on a single platform under YARN.
• Experience in Data Analysis, Data Cleaning (Scrubbing), Data Validation and Verification, Data Conversion, Data Migrations and Data Mining.
• Experience in all phases of the data warehouse life cycle, including requirement analysis, design, coding, testing, deployment, and ETL flow
• Experience architecting, designing, installing, configuring, and managing Apache Hadoop clusters
• Experience analyzing data in HDFS through MapReduce, Hive, and Pig
• Experience building and optimizing big data pipelines, architectures, and data sets.
• Strong analytical skills related to working with unstructured datasets
• Experience migrating big data workloads
• Experience with data pipeline and workflow management tools such as Airflow
• Experience with scripting languages such as Python and Scala
• Experience in cloud administration