
Ab Initio ETL Developer C2C job (in-person interview if the client requires) :: Onsite

Location: Dallas, TX. Onsite.

Contract

 

Required skills:
Please submit only candidates who are authorized to work in the United States.
Only applicants who are currently local to Dallas, Texas, or are willing to relocate will be considered.
· Design, develop, and deploy ETL processes using Ab Initio GDE.
· Build high-performance data integration and transformation pipelines.
· Work with the Ab Initio Co-Operating System, EME (Enterprise Meta Environment), and metadata-driven development.
· Develop and optimize graphs for batch and real-time processing.
· Integrate with RDBMSs (Oracle, SQL Server, Teradata, DB2, etc.) and external data sources.
· Implement continuous flows, web services, and message-based integration with Ab Initio:
o Continuous Flows (Co-Op & GDE)
o Plans and Psets
o Conduct-It for job scheduling and orchestration
o Graphs and Parameter Sets

Nice to have skills:
· Exposure to AWS, Azure, or GCP for cloud-based data solutions.
· Experience with big data ecosystems (Hadoop, Spark, Hive, Kafka) is a strong plus.
· Containerization (Docker, Kubernetes) knowledge is desirable.
· Monitoring & Security:
o Job monitoring and scheduling experience (Control-M, Autosys, or similar).
o Familiarity with security standards, encryption, and access management.

mikerecruiter3@gmail.com


