
Required: Senior Data Engineer (C2C) in Columbus, OH

 

 

Role: Senior Data Engineer

Location: Columbus, OH (Onsite)

Experience: 10+ Years

Duration: Long Term

Job Description:

Senior Data Engineer with strong Azure and Azure Data Factory (ADF) experience, hands-on with Snowflake.

  • Design, develop, and maintain scalable data pipelines
  • Develop data ingestion and integration processes (REST, SOAP, SFTP, MQ, etc.)
  • Take ownership of building data pipelines
  • Actively engage in technology discovery and implementation, both on-prem and in the cloud (e.g., Azure or AWS), to build solutions for future systems
  • Develop high-performance scripts in SQL, Python, etc. to meet enterprise data, BI, and analytics needs (see the Python sketch after this list)
  • Incorporate standards and best practices into engineering solutions
  • Manage code versions in source control and coordinate changes across the team
  • Participate in architecture design and discussions
  • Provide logical and physical data design and database modeling
  • Be part of the Agile team, collaborating to help shape requirements
  • Solve complex data issues around data integration, unusable data elements, unstructured data sets, and other data processing incidents
  • Support the development and design of the internal data integration framework
  • Work with system owners to resolve source data issues and refine transformation rules
  • Partner with enterprise teams, data scientists, and architects to define requirements and solutions
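The responsibilities above center on Python/SQL ingestion pipelines that land data in Snowflake, so here is a minimal sketch of what one such ingestion step might look like. It is illustrative only: the REST endpoint, table, warehouse, and credential names are hypothetical placeholders, it assumes the requests and snowflake-connector-python packages are installed, and in practice a script like this would typically be orchestrated from Azure Data Factory or Databricks rather than run standalone.

```python
# Minimal sketch of a REST-to-Snowflake ingestion step.
# All endpoints, table names, and connection details are hypothetical placeholders.
# Assumes: pip install requests snowflake-connector-python
import json
import os

import requests
import snowflake.connector


def fetch_records(api_url: str) -> list:
    """Pull raw JSON records from a REST source (placeholder endpoint)."""
    resp = requests.get(api_url, timeout=30)
    resp.raise_for_status()
    return resp.json()


def load_to_snowflake(rows: list) -> None:
    """Land raw JSON rows in a staging table as VARIANT for downstream transformation."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="LOAD_WH",   # placeholder warehouse
        database="RAW",        # placeholder database
        schema="STAGING",      # placeholder schema
    )
    cur = conn.cursor()
    try:
        cur.execute(
            "CREATE TABLE IF NOT EXISTS ORDERS_RAW ("
            "payload VARIANT, loaded_at TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP())"
        )
        for row in rows:
            # In Snowflake, PARSE_JSON must appear in a SELECT rather than a VALUES clause.
            cur.execute(
                "INSERT INTO ORDERS_RAW (payload) SELECT PARSE_JSON(%s)",
                (json.dumps(row),),
            )
    finally:
        cur.close()
        conn.close()


if __name__ == "__main__":
    load_to_snowflake(fetch_records("https://example.com/api/orders"))  # placeholder URL
```

Landing raw payloads as VARIANT in a staging table keeps the ingestion step simple and defers modeling and cleansing to downstream SQL transformations.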

Key Qualifications:

  • Have a B.A./B.S. and 5-8 years of relevant work experience, or an equivalent combination of education and experience
  • Must have extensive experience with Snowflake
  • Hands-on experience with the Microsoft stack (SSIS, SQL, etc.)
  • Possess strong analytical skills with the ability to analyze raw data, draw conclusions, and develop actionable recommendations
  • Experience with the Agile development process preferred
  • Proven track record of excellence, with past projects consistently delivered successfully
  • Hands-on experience with Azure Data Factory V2, Azure Databricks, Azure SQL Data Warehouse (SQL DW) or Snowflake, Azure Analysis Services, and Cosmos DB
  • Experience with Python or Scala
  • Understanding of continuous integration and continuous deployment on Azure
  • Experience with large-scale data lake or warehouse implementations on any of the public clouds (AWS, Azure, GCP)
  • Have excellent interpersonal and written/verbal communication skills
  • Manage financial information in a confidential and professional manner
  • Be highly motivated and flexible
  • Effectively handle multiple projects simultaneously and pay close attention to detail
  • Have experience in a multi-dimensional data environment

 kumar@itvisiongroup.com

 

