Enterprise Data Warehouse || Columbus, Ohio || Local to OH || C2C role

Job Title: Enterprise Data Warehouse Technical Specialist

Location: Columbus, Ohio (Onsite; local candidates only)

Duration: Long-Term Contract

Interview: Virtual

Visa: USC/GC

References: 2 required

  

Technical Specialist for the critical ODM Enterprise Data Warehouse (EDW). The EDW M&O effort migrates production data (weekly, monthly, and yearly loads) from the current Big Data environment to the Snowflake environment, runs ELT jobs, and checks data quality across disparate data sources in Snowflake to achieve strategic, long-term business goals.
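The data-quality checks mentioned above might, for example, look like the following minimal Snowpark Python sketch; the connection parameters, table name, and key column are hypothetical placeholders, not part of the actual ODM environment.

```python
# Illustrative sketch only: a simple Snowpark Python data-quality check after a load.
# All connection parameters and object names below are hypothetical placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col

connection_parameters = {
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "EDW_WH",   # hypothetical warehouse
    "database": "EDW_DB",    # hypothetical database
    "schema": "STAGE",
}

def load_quality_check(session: Session, table_name: str, key_column: str) -> dict:
    """Return simple row-count and null-key metrics for a freshly loaded table."""
    df = session.table(table_name)
    total_rows = df.count()
    null_keys = df.filter(col(key_column).is_null()).count()
    return {"table": table_name, "rows": total_rows, "null_keys": null_keys}

if __name__ == "__main__":
    session = Session.builder.configs(connection_parameters).create()
    print(load_quality_check(session, "EDW_DB.STAGE.CLAIMS_WEEKLY", "CLAIM_ID"))
    session.close()
```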

 

•Participate in team activities, design discussions, stand-up meetings, and planning reviews with the team.

•Provide Snowflake database technical support in developing reliable, efficient, and scalable solutions for various projects on Snowflake.

•Ingest the existing data, frameworks, and programs from the EDW IOP Big Data environment into the EDW Snowflake environment using best practices.

•Design and develop Snowpark features in Python; understand the requirements and iterate.

•Interface with the open-source community and contribute to Snowflake’s open-source libraries including Snowpark Python and the Snowflake Python Connector.

•Create, monitor, and maintain role-based access controls, virtual warehouses, Tasks, Snowpipe, and Streams on Snowflake databases to support different use cases.

•Tune the performance of Snowflake queries and procedures; recommend and document Snowflake best practices.

•Explore new Snowflake capabilities, perform proofs of concept (POCs), and implement them based on business requirements.

•Create and maintain the Snowflake technical documentation, ensuring compliance with data governance and security policies.

•Implement Snowflake user/query log analysis, history capture, and user email alert configuration.

•Enable data governance in Snowflake, including row/column-level data security using secure views and dynamic data masking features.

•Perform data analysis, data profiling, data quality checks, and data ingestion in various layers using Hadoop/Hive/Impala queries, PySpark programs, and UNIX shell scripts.

•Follow the organization's coding standards document; create mappings, sessions, and workflows per the mapping specification document.

•Perform gap and impact analysis of ETL and IOP jobs for new requirements and enhancements.

•Create mock data, perform unit testing, and capture result sets for jobs developed in lower environments.

•Update the production support runbook and Control-M schedule document per each production release.

•Create and update design documents, providing detailed descriptions of workflows after every production release.

•Continuously monitor production data loads, fix issues, update the tracker document, and identify performance issues.

•Tune long-running ETL/ELT jobs by creating partitions, enabling full loads, and applying other standard approaches.

•Perform quality assurance checks and reconciliation after data loads, and communicate with vendors to receive corrected data.

•Participate in ETL/ELT code reviews and design reusable frameworks.

•Create change requests, work plans, test results, and BCAB checklist documents for code deployments to the production environment, and perform code validation post-deployment.

•Work with the Snowflake, Hadoop, ETL, and SAS admin teams on code deployments and health checks.

•Create a reusable Audit Balance Control framework to capture reconciliation results, mapping parameters, and variables, serving as a single point of reference for workflows.

•Create Snowpark and PySpark programs to ingest historical and incremental data (see the illustrative Snowpark sketch after this list).

•Create Sqoop scripts to ingest historical data from the EDW Oracle database into Hadoop IOP, and create Hive table and Impala view creation scripts for dimension tables.

•Participate in meetings to continuously upgrade functional and technical expertise.
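As referenced in the ingestion bullet above, the following is a minimal, illustrative Snowpark Python sketch of loading staged Parquet files into a Snowflake table. The stage path, warehouse, database, and table names are hypothetical placeholders.

```python
# Illustrative sketch only: a simple Snowpark Python ingestion job of the kind the
# bullets above describe (loading staged Parquet files into a Snowflake table).
# The stage path, warehouse, database, and table names are hypothetical placeholders.
from snowflake.snowpark import Session

connection_parameters = {
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "EDW_WH",   # hypothetical warehouse
    "database": "EDW_DB",    # hypothetical database
    "schema": "RAW",
}

def ingest_parquet_from_stage(session: Session, stage_path: str, target_table: str) -> None:
    """Read Parquet files exported from the Big Data environment and append them to a table."""
    df = session.read.parquet(stage_path)  # e.g. "@EDW_STAGE/claims/weekly/"
    # Append so weekly/monthly/yearly loads accumulate; use "overwrite" for full reloads.
    df.write.mode("append").save_as_table(target_table)

if __name__ == "__main__":
    session = Session.builder.configs(connection_parameters).create()
    ingest_parquet_from_stage(session, "@EDW_STAGE/claims/weekly/", "EDW_DB.RAW.CLAIMS")
    session.close()
```

In practice, continuous or near-real-time loads of this kind are often handed to Snowpipe, with Snowpark reserved for transformation and validation logic.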

 

REQUIRED Skill Sets: 

 

Required Education: BS/BA degree or combination of education & experience.

   

Skill (Required/Desired; years of experience where specified):

•Experience in Hadoop, MapReduce, Sqoop, PySpark, Spark, HDFS, Hive, Impala, StreamSets, Kudu, Oozie, Hue, Kafka, YARN, Python, Flume, ZooKeeper, and Sentry (Required, 9 years)

•Strong development experience creating Sqoop scripts, PySpark programs, HDFS commands, and HDFS file formats (Required, 9 years)

•Writing Hadoop/Hive/Impala scripts for gathering stats on tables post data loads (Required, 9 years)

•Hands-on experience with cloud databases (Required, 6 years)

•Hands-on data migration experience from a Big Data environment to a Snowflake environment (Required, 3 years)

•Hands-on experience with the Snowflake platform, including Snowpipe and Snowpark (Required, 3 years)

•BS/BA degree or combination of education and experience (Required)

•In addition to overall Snowflake experience, development experience with both Snowpipe and Snowpark (Desired)

•Experience with data migration from a Big Data environment to a Snowflake environment (Desired)

•Strong understanding of Snowflake capabilities such as Snowpipe, Streams, and Tasks (Desired)

•Knowledge of security (SAML, SCIM, OAuth, OpenID, Kerberos, policies, entitlements, etc.) (Desired)

•Experience with system DRP for Snowflake systems (Desired)

•Effective leadership, analytical, and problem-solving skills (Desired)

•Excellent written and oral communication skills with technical and business teams (Desired)

•Ability to work independently as well as part of a team (Desired)

•Stay abreast of current technologies in the assigned IT area (Desired)

•Ability to establish facts and draw valid conclusions (Desired)

•Ability to recognize patterns and opportunities for improvement throughout the organization (Desired)

•Ability to discern critical from minor problems and innovate new solutions (Desired)

 

 

 

 

 

Thanks & regards!

Sonu Chauhan

Sr. Technical Recruiter

Phone: +1 3027820229  Ext. 111

Email: Sonu@virishatech.com

Address: 600 N Broad Street Suite 5 #269, Middletown, DE 19709

 

 



