US Citizen
Green Card
EAD (OPT/CPT/GC/H4)
H1B Work Permit
Corp-to-Corp
Contract to Hire
Consulting/Contract
UG :- Not Required
PG :- Not Required
No. of positions :- 1
Posted :- 26th Jan 2023
GCP, Big Data, Teradata
Detailed Job Description
• Overall 8+ years of professional IT experience, including around 5 years of Big Data expertise using the Hadoop framework: analysis, design, development, documentation, deployment, and integration using SQL and Big Data technologies.
• 5+ years' experience with Google Cloud services (streaming and batch) such as Cloud Storage, Cloud Dataflow, Cloud Pub/Sub, Cloud Composer, Dataproc, DFunc, BigQuery, and Bigtable.
• Responsible for building scalable distributed data solutions using Hadoop.
• Experience in implementing various Big Data Analytical, Cloud Data engineering, Data Warehouse/ Data Mart, Data Visualization, Reporting, Data Quality, and Data virtualization solutions.
• Experience in providing ETL solutions for any type of business model.
• Created stored procedures and macros in Teradata.
• Experience in moving high- and low-volume data objects from Teradata and Hadoop to Snowflake.
• Developed script files for processing data and loading it into HDFS, wrote HDFS CLI commands, and developed UNIX shell scripts for creating reports from Hive data.
• Experience in writing SQL queries, PL/SQL programming, and query-level performance tuning.
• In-depth understanding and usage of Teradata OLAP functions. Proficient in Teradata SQL, stored procedures, macros, views, and indexes (primary, secondary, PPI, join indexes, etc.).
• Hands-on experience with programming languages such as Java, Python, and Scala. Experience using Hadoop ecosystem components such as HDFS, YARN, MapReduce, Spark, Pig, Sqoop, Hive, Impala, HBase, Kafka, and Crontab.
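The reporting workflow described in the bullets above (extract data from Hive, process it, and produce a report) can be illustrated with a minimal sketch. This is plain Python with an in-memory CSV standing in for the Hive extract and HDFS staging the role describes; the column names and the aggregation are hypothetical, not taken from the posting.

```python
import csv
import io
from collections import defaultdict

# Hypothetical extract, standing in for rows pulled from a Hive table.
# In the actual pipeline this would arrive via the HDFS CLI or a Hive query,
# not a string literal.
HIVE_EXTRACT = """region,product,amount
east,widget,120.50
east,gadget,75.00
west,widget,200.00
west,widget,30.25
"""

def sales_by_region(extract: str) -> dict:
    """Aggregate the amount column per region -- the kind of rollup a
    reporting shell script would compute from Hive output."""
    totals = defaultdict(float)
    for row in csv.DictReader(io.StringIO(extract)):
        totals[row["region"]] += float(row["amount"])
    return dict(totals)

if __name__ == "__main__":
    for region, total in sorted(sales_by_region(HIVE_EXTRACT).items()):
        print(f"{region}\t{total:.2f}")
```

In a real deployment the extract and load steps would run against cluster storage, and the aggregation itself would more likely live in Spark, BigQuery, or a Teradata macro; the sketch only shows the extract-transform-report shape.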